The Facebook algorithm designers chose to let us see what our friends are talking about. They chose to show us, in some sense, more of the same. And that is a design decision they could have made differently. They could have said, "We're going to show you stuff that you've probably never seen before." I think they probably optimized their algorithm to make the most money, and that probably meant showing people stuff that they already sort of agreed with, or were more likely to agree with.
Cathy O'Neil

There are lots of different ways that algorithms can go wrong, and what we have now is a system in which we assume that because it's shiny new technology with a mathematical aura, it's perfect and doesn't require further vetting. Of course, we never make that assumption about other kinds of technology.
Cathy O'Neil

When people are not given an option by some secret scoring system, it's very hard to complain; they often don't even know that they've been victimized.
Cathy O'Neil

Most people don't have any association in their minds between what they do and ethics. They think they've somehow moved past questions of morality or values or ethics, and that's something I've never imagined to be true.
Cathy O'Neil