The Facebook algorithm designers chose to let us see what our friends are talking about. They chose to show us, in some sense, more of the same. And that is a design decision they could have made differently. They could have said, "We're going to show you stuff that you've probably never seen before." I think they probably optimized their algorithm to make the most money, and that probably meant showing people stuff that they already sort of agreed with, or were more likely to agree with.
Cathy O'Neil

I know how models are built, because I build them myself, so I know that I'm embedding my values into every single algorithm I create and I am projecting my agenda onto those algorithms.
Cathy O'Neil

People felt like they were friends with Google, and they believed in the "Don't Be Evil" thing that Google said. They trusted Google more than they trusted the government, and I never understood that.
Cathy O'Neil

When people are not given an option by some secret scoring system, it's very hard to complain, so they often don't even know that they've been victimized.
Cathy O'Neil

Because of my experience in Occupy, instead of asking the question, "Who will benefit from this system I'm implementing with the data?" I started to ask the question, "What will happen to the most vulnerable?" Or "Who is going to lose under this system? How will this affect the worst-off person?" Which is a very different question from "How does this improve certain people's lives?"
Cathy O'Neil