By construction, the world of big data is siloed and segmented and segregated so that successful people, like myself - technologists, well-educated white people, for the most part - benefit from big data, and it's the people on the other side of the economic spectrum, especially people of color, who suffer from it. They suffer from it individually, at different times, at different moments. They never get a clear explanation of what actually happened to them because all these scores are secret and sometimes they don't even know they're being scored.
Cathy O'Neil

That's what we do when we work in Silicon Valley tech startups: We think about who's going to benefit from this. That's almost the only thing we think about.
Cathy O'Neil

The Facebook algorithm designers chose to let us see what our friends are talking about. They chose to show us, in some sense, more of the same. And that is a design decision; they could have decided differently. They could have said, "We're going to show you stuff that you've probably never seen before." I think they probably optimized their algorithm to make the most money, and that probably meant showing people stuff that they already sort of agreed with, or were more likely to agree with.
Cathy O'Neil

There are lots of different ways that algorithms can go wrong, and what we have now is a system in which we assume, because it's shiny new technology with a mathematical aura, that it's perfect and doesn't require further vetting. Of course, we never have that assumption with other kinds of technology.
Cathy O'Neil

I think there's inherently an issue that models will literally never be able to handle, which is that when somebody comes along with a new way of doing something that's really excellent, the models will not recognize it. They only know how to recognize excellence when they can measure it somehow.
Cathy O'Neil