Kindle Notes & Highlights: Weapons of Math Destruction
by Cathy O'Neil
Read December 15, 2022
“How do you justify evaluating people by a measure for which you are unable to provide explanation?” But that’s the nature of WMDs. The analysis is outsourced to coders and statisticians. And as a rule, they let the machines do the talking.
An algorithm processes a slew of statistics and comes up with a probability that a certain person might be a bad hire, a risky borrower, a terrorist, or a miserable teacher. That probability is distilled into a score, which can turn someone’s life upside down.
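That pipeline, a slew of inputs reduced to one probability and then one score, can be sketched in a few lines. Everything below (the feature names, weights, and logistic form) is a hypothetical illustration, not any real vendor's model:

```python
import math

# Hypothetical weights for a hiring-risk model; in a real WMD these would be
# fit by a statistician and hidden from the people being scored.
WEIGHTS = {"gaps_in_employment": 1.2, "distance_from_office_km": 0.03,
           "personality_test": -0.8}
BIAS = -1.5

def probability_bad_hire(features: dict) -> float:
    """Logistic model: a slew of statistics in, one probability out."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

def score(features: dict) -> int:
    """The probability is distilled into a single opaque score, 0-100."""
    return round(100 * (1 - probability_bad_hire(features)))

applicant = {"gaps_in_employment": 2, "distance_from_office_km": 30,
             "personality_test": 0.4}
print(score(applicant))  # one number that can turn a life upside down
```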
The human victims of WMDs, we’ll see time and again, are held to a far higher standard of evidence than the algorithms themselves.
Ill-conceived mathematical models now micromanage the economy, from advertising to prisons. These WMDs have many of…
Baseball is an ideal home for predictive mathematical modeling. As Michael Lewis wrote in his 2003 bestseller, Moneyball, …
Baseball models are fair, in part, because they’re transparent. Everyone has access to the stats and can understand more or less how they’re interpreted.
Our own values and desires influence our choices, from the data we choose to collect to the questions we ask. Models are opinions embedded in mathematics.
And a model built for today will work a bit worse tomorrow. It will grow stale if it’s not constantly updated. Prices change, as do people’s preferences. A model built for a six-year-old won’t work for a teenager.
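Staleness is easy to demonstrate. In the toy simulation below (pure invention for illustration, not tied to any real dataset), a model memorizes yesterday's average price and is never retrained, so its error grows as preferences drift:

```python
import random

random.seed(0)

# Toy price model: yesterday, willingness to pay averaged $10, and the model
# learned exactly that. Today preferences drift upward, but the model is frozen.
def sample_market(mean: float, n: int = 1000) -> list[float]:
    return [random.gauss(mean, 2.0) for _ in range(n)]

train = sample_market(mean=10.0)
frozen_estimate = sum(train) / len(train)  # the model's fixed belief

for drift in (0.0, 1.0, 2.0, 3.0):  # preferences drift a bit each "day"
    today = sample_market(mean=10.0 + drift)
    error = abs(sum(today) / len(today) - frozen_estimate)
    print(f"drift={drift:.0f}  model error={error:.2f}")
# The error grows with drift: a model built for today works worse tomorrow.
```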
Consequently, racism is the most slovenly of predictive models. It is powered by haphazard data gathering and spurious correlations, reinforced by institutional inequities, and polluted by confirmation bias. In this way, oddly enough, racism operates like many of the WMDs I’ll be describing in this book.
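The feedback mechanism here can be simulated in a few lines. In this toy model (entirely my own illustration, not from the book), two groups have identical true incident rates, but a haphazard early gap in the data steers all subsequent scrutiny toward one of them:

```python
import random

random.seed(1)

# Two groups with the SAME true incident rate. Haphazard initial data makes
# group A "look" worse; confirmation bias then directs all scrutiny there.
TRUE_RATE = 0.1
recorded = {"A": 6, "B": 4}   # haphazard starting data
scrutiny = {"A": 0, "B": 0}

for day in range(50):
    # Watch wherever the record already says the problem is...
    target = max(recorded, key=recorded.get)
    scrutiny[target] += 1
    # ...and more watching yields more recorded incidents there, and only there.
    recorded[target] += sum(random.random() < TRUE_RATE for _ in range(20))

print(recorded, scrutiny)
# A's spurious early lead becomes self-confirming; B's record never grows.
```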
…mathematical models, by their nature, are based on the past, and on the assumption that patterns will repeat.
The U.S. News college ranking has great scale, inflicts widespread damage, and generates an almost endless spiral of destructive feedback loops. While it’s not as opaque as many other models, it is still a bona fide WMD.
Between 1985 and 2013, the cost of higher education rose by more than 500 percent, nearly four times the rate of inflation. To attract top students, colleges, as we saw at TCU, have gone on building booms, featuring glass-walled student centers, luxury dorms, and gyms with climbing walls and whirlpool baths.
Between 2004 and 2014, for-profit enrollment tripled, and the industry now accounts for 11 percent of the country’s college and university students.
When it comes to WMDs, predatory ads practically define the genre. They zero in on the most desperate among us at enormous scale. In…
…destruction caused by WMDs. Promising efficiency and fairness, they distort higher education, drive up debt, spur mass incarceration, pummel the poor at nearly every juncture, and undermine democracy.
In the common view, the ills of poverty are more like a disease, and the effort—or at least the rhetoric—is to quarantine it and keep it from spreading to the middle class.
If a Big Data college application model had established itself in the early 1960s, we still wouldn’t have many women going to college, because it would have been trained largely on successful men.
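The point about codifying the past is mechanical, not rhetorical: a model trained only on who succeeded before will score anyone unlike them poorly. A deliberately crude sketch, with invented 1960s-style admissions data, makes it concrete:

```python
# Training data reflects 1960s outcomes: nearly all "successful" records are
# men, because women were rarely admitted in the first place.
past_graduates = [{"gender": "M"}] * 98 + [{"gender": "F"}] * 2

def success_rate(gender: str) -> float:
    matches = [g for g in past_graduates if g["gender"] == gender]
    return len(matches) / len(past_graduates)

def admit_score(applicant: dict) -> float:
    # The model "learns" that resembling past graduates predicts success.
    return success_rate(applicant["gender"])

print(admit_score({"gender": "M"}))  # 0.98
print(admit_score({"gender": "F"}))  # 0.02 -- the past, codified as destiny
```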
Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that’s something only humans can provide. We have to explicitly embed better values into our algorithms, creating Big Data models that follow our ethical lead. Sometimes that will mean putting fairness ahead of profit.
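"Embedding better values" can be made operational, for example as an explicit constraint at model-selection time. A minimal sketch, where the demographic-parity metric and the 0.05 tolerance are assumptions chosen for illustration, not a standard:

```python
# Pick the most profitable model ONLY among those that pass a fairness gate.
# Here fairness is demographic parity: selection rates across groups must
# differ by at most a chosen tolerance (0.05 is an arbitrary assumption).
candidate_models = [
    {"name": "aggressive", "profit": 1.00, "rate_group_a": 0.60, "rate_group_b": 0.30},
    {"name": "balanced",   "profit": 0.85, "rate_group_a": 0.45, "rate_group_b": 0.42},
    {"name": "strict",     "profit": 0.70, "rate_group_a": 0.40, "rate_group_b": 0.40},
]

def is_fair(model: dict, tolerance: float = 0.05) -> bool:
    return abs(model["rate_group_a"] - model["rate_group_b"]) <= tolerance

fair_models = [m for m in candidate_models if is_fair(m)]
chosen = max(fair_models, key=lambda m: m["profit"])
print(chosen["name"])  # "balanced": the unfair model never gets to compete
```

The gate runs before the profit comparison, so a model that fails the fairness check can never win on profit alone; that ordering is what "putting fairness ahead of profit" looks like in code.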