Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy
Highlight at 5%:
The privileged, we’ll see time and again, are processed more by people, the masses by machines.
Highlight at 6%:
The human victims of WMDs, we’ll see time and again, are held to a far higher standard of evidence than the algorithms themselves.
Highlight at 8%:
Here we see that models, despite their reputation for impartiality, reflect goals and ideology. When I removed the possibility of eating Pop-Tarts at every meal, I was imposing my ideology on the meals model. It’s something we do without a second thought. Our own values and desires influence our choices, from the data we choose to collect to the questions we ask. Models are opinions embedded in mathematics.
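That last sentence is easy to see in code. Below is a minimal sketch of a meals model of the kind O’Neil describes; every food, price, and calorie count is invented. The optimization itself is neutral arithmetic, but the forbidden parameter is where the modeler’s values enter, before any math runs.

    # A toy meals model: pick the cheapest plan that meets a calorie
    # target. The math is neutral; the constraint is not.
    # (All foods, prices, and calorie counts are invented.)
    FOODS = {
        "pop_tarts": {"cost": 0.50, "calories": 400},
        "eggs":      {"cost": 0.80, "calories": 300},
        "salad":     {"cost": 2.50, "calories": 150},
        "chicken":   {"cost": 3.00, "calories": 500},
    }

    def cheapest_plan(calorie_target, forbidden=frozenset()):
        """Greedy sketch: keep adding the cheapest food per calorie."""
        allowed = {name: f for name, f in FOODS.items() if name not in forbidden}
        best = min(allowed, key=lambda n: allowed[n]["cost"] / allowed[n]["calories"])
        plan, calories = [], 0
        while calories < calorie_target:
            plan.append(best)
            calories += allowed[best]["calories"]
        return plan

    # Left to itself, the model serves Pop-Tarts at every meal...
    print(cheapest_plan(1200))
    # ...until the modeler's values arrive as a constraint.
    print(cheapest_plan(1200, forbidden={"pop_tarts"}))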
Highlight at 10%:
(Which isn’t to say that many players, come contract time, won’t quibble with a model’s valuations: “Sure I struck out two hundred times, but look at my home runs…”)
Saurabh’s note: Why is this quibble not problematic?
Highlight at 11%:
No, my model is benign, especially since it’s unlikely ever to leave my head and be formalized into code.
Saurabh’s note: Not convinced.
Highlight at 11%:
A key component of this suffering is the pernicious feedback loop. As we’ve seen, sentencing models that profile a person by his or her circumstances help to create the environment that justifies their assumptions. This destructive loop goes round and round, and in the process the model becomes more and more unfair.
Saurabh’s note: So a good feedback loop is needed, not just any feedback loop.
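Saurabh’s distinction can be made concrete with a toy simulation (all numbers invented). Two neighborhoods have the same underlying offense rate, but the model starts with a slight prior against one. Because recorded arrests scale with patrol intensity rather than with the true rate, the model’s own output confirms its assumption round after round; nothing in the loop lets the equal true rates correct it.

    import random
    random.seed(0)

    TRUE_RATE = {"A": 0.10, "B": 0.10}   # same real offense rate in both places
    risk_score = {"A": 0.55, "B": 0.45}  # model's initial prior against A

    for year in range(5):
        # Patrols are allocated in proportion to the risk scores...
        patrols = {n: int(200 * risk_score[n]) for n in risk_score}
        # ...and recorded arrests scale with patrols, not with true crime.
        arrests = {n: sum(random.random() < TRUE_RATE[n] for _ in range(patrols[n]))
                   for n in patrols}
        # The model "retrains" on its own arrest records.
        total = sum(arrests.values()) or 1
        risk_score = {n: arrests[n] / total for n in arrests}
        print(year, patrols, arrests)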
Highlight at 12%:
So to sum up, these are the three elements of a WMD: Opacity, Scale, and Damage.
Highlight at 29%:
In our largely segregated cities, geography is a highly effective proxy for race. If the purpose of the models is to prevent serious crimes…
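How effective a proxy can geography be? A toy illustration (the ZIP codes, group labels, and counts are all invented): in a segregated city, a rule that never sees the protected attribute can still recover it from ZIP code alone.

    from collections import Counter

    # Two ZIP codes, heavily segregated by construction.
    population = (
        [("10001", "group_x")] * 90 + [("10001", "group_y")] * 10 +
        [("10002", "group_x")] * 10 + [("10002", "group_y")] * 90
    )

    # A "blind" rule that sees only ZIP codes: predict each ZIP's
    # majority group.
    by_zip = {}
    for zip_code, group in population:
        by_zip.setdefault(zip_code, Counter())[group] += 1
    predict = {z: c.most_common(1)[0][0] for z, c in by_zip.items()}

    accuracy = sum(predict[z] == g for z, g in population) / len(population)
    print(predict, f"recovered from geography alone: {accuracy:.0%}")  # 90%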
Highlight at 30%:
The result is that we criminalize poverty, believing all the while that our tools are not only scientific but fair.
Highlight at 32%:
So fairness isn’t calculated into WMDs. And the result is massive, industrial production of unfairness. If you think of a WMD as a factory, unfairness is the black stuff belching out of the smokestacks. It’s an emission, a toxic one.
Highlight at 32%:
Justice cannot just be something that one part of society inflicts upon the other.
Highlight at 40%:
Phrenology was a model that relied on pseudoscientific nonsense to make authoritative pronouncements, and for decades it went untested. Big Data can fall into the same trap. Models like the ones that red-lighted Kyle Behm and blackballed foreign medical students at St. George’s can lock people out, even when the “science” inside them is little more than a bundle of untested assumptions.
Highlight at 42%:
The long and irregular hours also make it hard for workers to organize or to protest for better conditions. Instead, they face heightened anxiety and sleep deprivation, which causes dramatic mood swings and is responsible for an estimated 13 percent of highway deaths. Worse yet, since the software is designed to save companies money, it often limits workers’ hours to fewer than thirty per week, so that they are not eligible for company health insurance.
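The cost logic behind that last sentence fits in a few lines. This is a sketch, not any vendor’s actual scheduler; the wage, the benefits cost, and the thirty-hour line as a hard eligibility threshold are illustrative assumptions. Once crossing the threshold carries a fixed cost, capping everyone at twenty-nine hours falls straight out of the arithmetic.

    # Sketch of the incentive O'Neil describes: a scheduler that will
    # not assign a 30th hour, because crossing that line triggers
    # benefits eligibility. (Threshold and costs are illustrative.)
    HOURLY_WAGE = 12.00
    BENEFITS_THRESHOLD = 30       # hours/week at which insurance kicks in
    WEEKLY_BENEFITS_COST = 150.0  # invented employer cost

    def weekly_cost(hours: int) -> float:
        cost = hours * HOURLY_WAGE
        if hours >= BENEFITS_THRESHOLD:
            cost += WEEKLY_BENEFITS_COST
        return cost

    def assign_hours(needed_hours: int, workers: list[str]) -> dict[str, int]:
        """Spread demand so no one worker crosses the benefits threshold."""
        cap = BENEFITS_THRESHOLD - 1  # 29 hours: the cheapest legal maximum
        schedule = dict.fromkeys(workers, 0)
        for _ in range(needed_hours):
            # Give the next hour to the least-used worker still under the cap.
            eligible = [w for w in workers if schedule[w] < cap]
            if not eligible:
                raise RuntimeError("demand exceeds capped capacity: add part-timers")
            schedule[min(eligible, key=schedule.get)] += 1
        return schedule

    print(assign_hours(85, ["ana", "ben", "carla"]))  # everyone lands under 30
    print(weekly_cost(29), weekly_cost(30))           # 348.0 vs 510.0: the cliff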
Highlight at 43%:
Very soon, the roles of the employees appeared to come into focus. Some people were idea generators, the system concluded. On its chart of employees, Cataphora marked idea generators with circles, which were bigger and darker if they produced lots of ideas. Other people were connectors. Like neurons in a distributed network, they transmitted information. The most effective connectors made snippets of words go viral. The system painted those people in dark colors as well.
Saurabh’s note: New org chart?
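Cataphora’s internals aren’t public, so the following is a rough reconstruction under one plausible reading of the passage, with an invented message log: score “idea generators” by the messages they originate and “connectors” by how often they pass others’ material along.

    # Rough reconstruction, NOT Cataphora's actual algorithm.
    from collections import defaultdict

    # (sender, receiver, originated) entries from a toy message log
    messages = [
        ("dee", "al", True),  ("dee", "bo", True), ("dee", "cy", True),
        ("al", "bo", False),  ("al", "cy", False), ("al", "dee", False),
        ("bo", "cy", True),   ("cy", "bo", False),
    ]

    ideas = defaultdict(int)    # the bigger, darker circles on the chart
    relayed = defaultdict(int)  # the "neurons" transmitting information

    for sender, _receiver, originated in messages:
        if originated:
            ideas[sender] += 1
        else:
            relayed[sender] += 1

    for person in sorted(set(ideas) | set(relayed)):
        print(f"{person}: ideas={ideas[person]}, relays={relayed[person]}")
    # dee scores as an idea generator, al as a connector; note that
    # nothing here measures whether any of the ideas were any good.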
Highlight at 43%:
So the system identifies apparent losers. And a good number of them lost their jobs during the recession. That alone is unjust. But what’s worse is that systems like Cataphora’s receive minimal feedback data. Someone identified as a loser, and subsequently fired, may have found another job and generated a fistful of patents. That data usually isn’t collected. The system has no inkling that it got one person, or even a thousand people, entirely wrong.
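The missing feedback is a selection effect, and it shows up even in a four-row toy example (names and outcomes invented): outcomes are only ever recorded for the people the model retained, so its worst mistakes generate no correcting data.

    # The feedback gap in one loop: a wrongly fired "loser" who thrives
    # elsewhere produces no data point the system will ever see.
    employees = [
        # (name, model_says_loser, actual_future_performance)
        ("pat", True,  "files patents at next job"),
        ("lee", False, "average"),
        ("sam", True,  "average"),
        ("kim", False, "strong"),
    ]

    training_feedback = []
    for name, flagged, outcome in employees:
        if flagged:
            continue  # fired; later success is invisible to the system
        training_feedback.append((name, outcome))

    print(training_feedback)
    # [('lee', 'average'), ('kim', 'strong')] -- pat's patents never
    # arrive, so the next model learns nothing from its worst mistake.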
Highlight at 65%:
Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that’s something only humans can provide. We have to explicitly embed better values into our algorithms, creating Big Data models that follow our ethical lead. Sometimes that will mean putting fairness ahead of profit.
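“Putting fairness ahead of profit” has a direct translation into an objective function. One sketch, under the assumptions that fairness means equal selection rates across groups and that the penalty weight is set by hand (both are modeling choices, i.e., opinions embedded in mathematics): score a decision rule by profit minus a fairness penalty rather than by profit alone.

    # Embed a value into the objective: profit MINUS a fairness penalty.
    def selection_rates(decisions, groups):
        rates = {}
        for g in set(groups):
            members = [d for d, grp in zip(decisions, groups) if grp == g]
            rates[g] = sum(members) / len(members)
        return rates

    def objective(decisions, profits, groups, fairness_weight=5.0):
        profit = sum(p for d, p in zip(decisions, profits) if d)
        rates = selection_rates(decisions, groups)
        unfairness = max(rates.values()) - min(rates.values())  # parity gap
        return profit - fairness_weight * profit * unfairness

    profits = [10, 8, 9, 7]
    groups  = ["x", "x", "y", "y"]
    lopsided = [1, 1, 0, 0]  # selects only group x
    balanced = [1, 0, 1, 0]  # equal selection rates
    print(objective(lopsided, profits, groups))  # 18 profit, full penalty: -72.0
    print(objective(balanced, profits, groups))  # 19 profit, no penalty: 19.0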
Highlight at 67%:
Mathematical models should be our tools, not our masters.