[Discussion] Examples of mis-specifying optimization objectives causing unpleasant outcomes
In Stuart Russell’s book ‘Human Compatible’, he gives the example of a social platform specifying click-through-rate maximization as its objective. This not only promoted echo chambers, but actually, slowly, modified people’s preferences so that we become more predictable. In the process it drove viewpoints toward the extremes, because it is easier to predict what content will be clicked when someone’s views sit far to one side of the spectrum.
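To make that feedback loop concrete, here is a minimal toy simulation. Everything in it is my own illustrative assumption, not Russell’s actual model: I assume users click content near their current preference, that users with more extreme preferences click more predictably, and that each click nudges the user’s preference toward the content shown. It then compares a recommender that simply serves content matching the user against one that nudges content slightly more extreme:

```python
import random

def click_prob(content, pref):
    # Toy assumptions: users click content close to their preference, and
    # users with extreme preferences (|pref| near 1) click more reliably.
    proximity = max(0.0, 1.0 - abs(content - pref))
    predictability = 0.5 + 0.5 * abs(pref)
    return proximity * predictability

def simulate(policy, steps=500, drift=0.05, seed=42):
    """Run one user against a recommendation policy; return (clicks, final pref)."""
    rng = random.Random(seed)
    pref, clicks = 0.1, 0  # user starts near the center
    for _ in range(steps):
        content = policy(pref)
        if rng.random() < click_prob(content, pref):
            clicks += 1
            # Each click shifts the user's preference toward the shown content.
            pref += drift * (content - pref)
            pref = max(-1.0, min(1.0, pref))
    return clicks, pref

# Policy 1: serve exactly what the user already likes.
match_policy = lambda p: p
# Policy 2: nudge content a bit more extreme than the user's current preference.
extremize = lambda p: max(-1.0, min(1.0, p + (0.3 if p >= 0 else -0.3)))

for name, policy in [("match", match_policy), ("extremize", extremize)]:
    clicks, final_pref = simulate(policy)
    print(f"{name:9s}: clicks={clicks}, final preference={final_pref:+.2f}")
```

Under these (assumed) dynamics, the extremizing policy ends up with more total clicks: it sacrifices a few clicks early on, but by dragging the user toward an extreme it makes them more predictable and harvests a higher click rate afterward. A myopic CTR maximizer that can influence preferences is effectively rewarded for doing exactly this.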
I find this example complex and interesting, and I’m wondering: what are other real-world examples?
submitted by /u/dbcrib