People felt like they were friends with Google, and they believed in the "Do No Evil" thing that Google said. They trusted Google more than they trusted the government, and I never understood that.
- Cathy O'Neil

With recidivism algorithms, for example, I worry about racist outcomes. With personality tests [for hiring], I worry about filtering out people with mental health problems from jobs. And with a teacher value-added model algorithm [used in New York City to score teachers], I worry literally that it's not meaningful. That it's almost a random number generator.
- Cathy O'Neil

We don't let a car company just throw out a car and start driving it around without checking that the wheels are fastened on. We know that would result in death; but for some reason we have no hesitation at throwing out some algorithms untested and unmonitored even when they're making very important life-and-death decisions.
- Cathy O'Neil

An insurance company might say, "Tell us more about yourself so your premiums can go down." When they say that, they're addressing the winners, not the losers.
- Cathy O'Neil

Obviously the more transparency we have as auditors, the more we can get, but the main goal is to understand important characteristics about a black box algorithm without necessarily having to understand every single granular detail of the algorithm.
- Cathy O'Neil

There are lots of different ways that algorithms can go wrong, and what we have now is a system in which we assume because it's shiny new technology with a mathematical aura that it's perfect and it doesn't require further vetting. Of course, we never have that assumption with other kinds of technology.
- Cathy O'Neil