I don't think anybody's ever been notified that they were sentenced to an extra two years because their recidivism score had been high, or notified that this beat cop happened to be in their neighborhood checking people's pockets for pot because of a predictive policing algorithm. That's just not how it works.
Cathy O'Neil

That's what we do when we work in Silicon Valley tech startups: We think about who's going to benefit from this. That's almost the only thing we think about.
Cathy O'Neil

Because of my experience in Occupy, instead of asking the question, "Who will benefit from this system I'm implementing with the data?" I started to ask the question, "What will happen to the most vulnerable?" Or "Who is going to lose under this system? How will this affect the worst-off person?" Which is a very different question from "How does this improve certain people's lives?"
Cathy O'Neil

I would argue that one of the major problems with our blind trust in algorithms is that we can propagate discriminatory patterns without acknowledging any kind of intent.
Cathy O'Neil

There are lots of different ways that algorithms can go wrong, and what we have now is a system in which we assume that because it's shiny new technology with a mathematical aura, it's perfect and doesn't require further vetting. Of course, we'd never make that assumption about other kinds of technology.
Cathy O'Neil