Kindle Notes & Highlights
by Cathy O'Neil
Read between March 29 – April 1, 2023
In WMDs, many poisonous assumptions are camouflaged by math and go largely untested and unquestioned.
The privileged, we’ll see time and again, are processed more by people, the masses by machines.
The human victims of WMDs, we’ll see time and again, are held to a far higher standard of evidence than the algorithms themselves.
A model’s blind spots reflect the judgments and priorities of its creators.
Here we see that models, despite their reputation for impartiality, reflect goals and ideology. When I removed the possibility of eating Pop-Tarts at every meal, I was imposing my ideology on the meals model. It’s something we do without a second thought. Our own values and desires influence our choices, from the data we choose to collect to the questions we ask. Models are opinions embedded in mathematics.
The ideal way to circumvent such prejudice is to consider applicants blindly. Orchestras, which had long been dominated by men, famously started in the 1970s to hold auditions with the musician hidden behind a sheet. Connections and reputations suddenly counted for nothing. Nor did the musician’s race or alma mater. The music from behind the sheet spoke for itself. Since then, the percentage of women playing in major orchestras has leapt by a factor of five—though they still make up only a quarter of the musicians.
In this march through a virtual lifetime, we’ve visited school and college, the courts and the workplace, even the voting booth. Along the way, we’ve witnessed the destruction caused by WMDs. Promising efficiency and fairness, they distort higher education, drive up debt, spur mass incarceration, pummel the poor at nearly every juncture, and undermine democracy. It might seem like the logical response is to disarm these weapons, one by one.
Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that’s something only humans can provide. We have to explicitly embed better values into our algorithms, creating Big Data models that follow our ethical lead. Sometimes that will mean putting fairness ahead of profit.