Kindle Notes & Highlights
by
Cathy O'Neil
Read between
May 12 - May 17, 2022
That probability is distilled into a score, which can turn someone’s life upside down. And yet when the person fights back, “suggestive” countervailing evidence simply won’t cut it. The case must be ironclad. The human victims of WMDs, we’ll see time and again, are held to a far higher standard of evidence than the algorithms themselves.
Ill-conceived mathematical models now micromanage the economy, from advertising to prisons. These WMDs have many of the same characteristics as the value-added model that derailed Sarah Wysocki’s career in Washington’s public schools. They’re opaque, unquestioned, and unaccountable, and they operate at a scale to sort, target, or “optimize” millions of people. By confusing their findings with on-the-ground reality, most of them create pernicious WMD feedback loops.
Here we see that models, despite their reputation for impartiality, reflect goals and ideology. When I removed the possibility of eating Pop-Tarts at every meal, I was imposing my ideology on the meals model. It’s something we do without a second thought. Our own values and desires influence our choices, from the data we choose to collect to the questions we ask. Models are opinions embedded in mathematics.
models as simple as a smoke alarm on their fellow humans. Racism, at the individual level, can be seen as a predictive model whirring away in billions of human minds around the world. It is built from faulty, incomplete, or generalized data. Whether it comes from experience or hearsay, the data indicates that certain types of people have behaved badly. That generates a binary prediction that all people of that race will behave that same way.
the three elements of a WMD: Opacity, Scale, and Damage.
when you create a model from proxies, it is far simpler for people to game it.
we criminalize poverty, believing all the while that our tools are not only scientific but fair.
What we found, to no great surprise, was that an overwhelming majority of these encounters—about 85 percent—involved young African American or Latino men. In certain neighborhoods, many of them were stopped repeatedly. Only 0.1 percent, or one out of every thousand people stopped, was linked in any way to a violent crime. Yet this filter captured many others for lesser crimes, from drug possession to underage drinking, that might have otherwise gone undiscovered. Some of the targets, as you might expect, got angry, and a good number of those found themselves charged with resisting arrest.
companies can get in trouble for screening out applicants on the basis of such questions. Regulators in Rhode Island found that CVS Pharmacy was illegally screening out applicants with mental illnesses when a personality test required respondents to agree or disagree with such statements as “People do a lot of things that make you angry” and “There’s no use having close friends; they always let you down.”
As of 2015, white households held on average roughly ten times as much money and property as black and Hispanic households. And while only 15 percent of whites had zero or negative net worth, more than a third of black and Hispanic households found themselves with no cushion.
By their sixties, whites are eleven times richer than African Americans.
An innocent person whose name resembles that of a suspected terrorist faces a hellish ordeal every time he has to get on a plane. (Wealthy travelers, by contrast, are often able to pay to acquire “trusted traveler” status, which permits them to waltz through security. In effect, they’re spending money to shield themselves from a WMD.)
the entire political system—the money, the attention, the fawning—turns to targeted voters like a flower following the sun. The rest of us are virtually ignored (except for fund-raising come-ons). The programs have already predicted our voting behavior, and any attempt to change it is not worth the investment.
Human decision making, while often flawed, has one chief virtue. It can evolve.
Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that’s something only humans can provide. We have to explicitly embed better values into our algorithms, creating Big Data models that follow our ethical lead. Sometimes that will mean putting fairness ahead of profit.