With recidivism algorithms, for example, I worry about racist outcomes. With personality tests [for hiring], I worry about filtering out people with mental health problems from jobs. And with a teacher value-added model algorithm [used in New York City to score teachers], I worry literally that it's not meaningful. That it's almost a random number generator.
Cathy O'Neil

There are lots of different ways that algorithms can go wrong, and what we have now is a system in which we assume because it's shiny new technology with a mathematical aura that it's perfect and it doesn't require further vetting. Of course, we never have that assumption with other kinds of technology.

Cathy O'Neil

That's what we do when we work in Silicon Valley tech startups: We think about who's going to benefit from this. That's almost the only thing we think about.

Cathy O'Neil

I think what's happened is that the general public has become much more aware of the destructive power of Wall Street.

Cathy O'Neil

I know how models are built, because I build them myself, so I know that I'm embedding my values into every single algorithm I create and I am projecting my agenda onto those algorithms.
Cathy O'Neil