The national conversation around white entitlement, around institutionalized racism, the Black Lives Matter movement, I think, came about in large part because of the widening and broadening of our understanding of inequality. That conversation was begun by Occupy.
Cathy O'Neil

I know how models are built, because I build them myself, so I know that I'm embedding my values into every single algorithm I create and I am projecting my agenda onto those algorithms.
Cathy O'Neil

I would argue that one of the major problems with our blind trust in algorithms is that we can propagate discriminatory patterns without acknowledging any kind of intent.
Cathy O'Neil

So much of our society as a whole is gearing us to maximize our salary or bonus. Basically, we just think in terms of money. Or, if not money, then, if you're in academia, it's prestige. It's a different kind of currency. And there's this unmeasured dimension of all jobs, which is whether it's improving the world.
Cathy O'Neil

My fantasy is that there is a new regulatory body that is in charge of algorithmic auditing.
Cathy O'Neil

There are lots of different ways that algorithms can go wrong, and what we have now is a system in which we assume because it's shiny new technology with a mathematical aura that it's perfect and it doesn't require further vetting. Of course, we never have that assumption with other kinds of technology.
Cathy O'Neil