Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy
52%
The new industry gave people, for the first time, the chance to pool their collective risk, protecting individuals when misfortune struck.
Chris Aldrich
Pooling collectively being the key.
53%
And in Florida, adults with clean driving records and poor credit scores paid an average of $1,552 more than the same drivers with excellent credit and a drunk driving conviction.
Chris Aldrich
Holy shit!
53%
According to a watchdog group, the Consumer Federation of America, Allstate analyzes consumer and demographic data to determine the likelihood that customers will shop for lower prices. If they aren’t likely to, it makes sense to charge them more. And that’s just what Allstate does.
55%
In 1943, at the height of World War II, when the American armies and industries needed every troop or worker they could find, the Internal Revenue Service tweaked the tax code, granting tax-free status to employer-based health insurance.
56%
At the center of the weight issue is a discredited statistic, the body mass index. This is based on a formula devised two centuries ago by a Belgian mathematician, Lambert Adolphe Jacques Quetelet,
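Quetelet's formula is simply weight divided by the square of height. A minimal sketch (the function name and example values are my own, not from the book) makes clear why it is a population-level summary rather than an individual diagnostic:

```python
# Quetelet's body mass index: weight (kg) divided by height (m) squared.
# A statistic designed to describe populations, not to diagnose individuals.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

# Example: 70 kg at 1.75 m
print(round(bmi(70, 1.75), 1))  # → 22.9
```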
56%
Averages measure entire populations and often don’t apply to individuals.”
57%
(The one area where wellness programs do show positive results is in quitting smoking.)
57%
wellness programs, despite well-publicized individual successes, often don’t lead to lower health care spending.
57%
In fact, the greatest savings from wellness programs come from the penalties assessed on the workers. In other words, like scheduling algorithms, they provide corporations with yet another tool to raid their employees’ paychecks.
57%
And if companies cooked up their own health and productivity models, this could grow into a full-fledged WMD.
57%
As soon as I hit send, that petition belongs to Facebook, and the social network’s algorithm makes a judgment about how to best use it.
Chris Aldrich
IndieWeb
57%
By tweaking its algorithm and molding the news we see, can Facebook game the political system?
58%
The Facebook campaign started out with a constructive and seemingly innocent goal: to encourage people to vote.
58%
But when the New York Times or CNN covers a story, everyone sees it. Their editorial decision is clear, on the record. It is not opaque. And people later debate (often on Facebook) whether that decision was the right one.
59%
In 2012, researchers experimented on 680,000 Facebook users to see if the updates in their news feeds could affect their mood.
59%
Its platform is massive, powerful, and opaque. The algorithms are hidden from us, and we see only the results of the experiments researchers choose to publish.
59%
So companies like Google would be risking their own reputation, and inviting a regulatory crackdown, if they doctored results to favor one political outcome over another. Then again, how would anyone know? What we learn about these Internet giants comes mostly from the tiny proportion of their research that they share. Their algorithms represent vital trade secrets. They carry out their business in the dark.
59%
Romney’s targeting, it turned out, was inexact. The caterers circulating among the donors, serving drinks and canapés, were outsiders. And like nearly everyone in the developed world, they carried phones equipped with video cameras. Romney’s dismissive remarks, captured by a bartender, went viral.
60%
Direct mail was microtargeting on training wheels. The convergence of Big Data and consumer marketing now provides politicians with far more powerful tools. They can target microgroups of citizens for both votes and money and appeal to each of them with a meticulously honed message, one that no one else is likely to see. It might be a banner on Facebook or a fund-raising email. But each one allows candidates to quietly sell multiple versions of themselves—and it’s anyone’s guess which version will show up for work after inauguration.
60%
Ghani’s science translated perfectly into politics. Those fickle shoppers who switched brands to save a few cents, for example, behaved very much like swing voters. In the supermarket, it was possible to estimate how much it would cost to turn each shopper from one brand of ketchup or coffee to another more profitable brand. The supermarket could then pick out, say, the 15 percent most likely to switch and provide them with coupons. Smart targeting was essential. They certainly didn’t want to give coupons to shoppers who were ready to pay full price. That was like burning money.
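The targeting logic described above, rank shoppers by estimated switch probability, coupon only the top slice, and never waste coupons on those ready to pay full price, can be sketched in a few lines. This is an illustrative reconstruction, not Ghani's actual code; the function and data names are invented:

```python
# Hypothetical sketch of switch-probability targeting:
# rank shoppers by estimated probability of switching brands,
# then coupon only the top fraction (e.g. 15%).
def pick_coupon_targets(switch_probs: dict, top_frac: float = 0.15) -> set:
    """switch_probs maps shopper_id -> estimated switch probability."""
    ranked = sorted(switch_probs, key=switch_probs.get, reverse=True)
    k = max(1, int(len(ranked) * top_frac))
    return set(ranked[:k])

# Toy population of 100 shoppers with evenly spread switch probabilities.
probs = {f"shopper{i}": i / 100 for i in range(100)}
targets = pick_coupon_targets(probs)
print(len(targets))  # → 15
```

Loyal full-price shoppers land at the bottom of the ranking and never receive a coupon, which is exactly the "burning money" case the passage describes.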
61%
we can think of the voting public very much as we think of financial markets. With the flow of information, values rise and fall, as do investments.
62%
But with microtargeting, the focus shifts from the region to the individual. More important, that individual alone sees the customized version of the politician.
62%
The political marketers maintain deep dossiers on us, feed us a trickle of information, and measure how we respond to it. But we’re kept in the dark about what our neighbors are being fed. This resembles a common tactic used by business negotiators. They deal with different parties separately so that none of them knows what the other is hearing. This asymmetry of information prevents the various parties from joining forces—which is precisely the point of a democratic government.
63%
Change that objective from leeching off people to helping them, and a WMD is disarmed—and can even become a force for good.
63%
we’ve visited school and college, the courts and the workplace, even the voting booth.
63%
Our national motto, E Pluribus Unum, means “Out of Many, One.” But WMDs reverse the equation. Working in darkness, they carve one into many, while hiding us from the harms they inflict upon our neighbors near and far.
64%
We cannot count on the free market itself to right these wrongs.
65%
Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that’s something only humans can provide. We have to explicitly embed better values into our algorithms, creating Big Data models that follow our ethical lead. Sometimes that will mean putting fairness ahead of profit.
66%
But we need to impose human values on these systems, even at the cost of efficiency.
66%
Mathematical models should be our tools, not our masters.
66%
The achievement gap, mass incarceration, and voter apathy are big, nationwide problems that no free market nor mathematical algorithm will fix.
67%
equal before the law, or be treated equally as voters, we cannot stand for systems that drop us into different castes and treat us differently.
67%
researchers have launched the Web Transparency and Accountability Project. They create software robots that masquerade online as people of all stripes—rich, poor, male, female, or suffering from mental health issues. By studying the treatment these robots receive, the academics can detect biases in automated systems from search engines to job placement sites.
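The audit technique the researchers use, sending identical probes under different synthetic personas and comparing the responses, can be sketched abstractly. Everything here is invented for illustration: `query_system` is a toy stand-in for whatever black-box service is under audit, not a real API, and the bias threshold is arbitrary:

```python
# Illustrative persona-based audit probe: issue the same request under
# different synthetic profiles and flag large disparities in treatment.
def query_system(persona: dict) -> float:
    # Toy stand-in for the audited black box: here it (unfairly)
    # scores applicants by ZIP code, a crude income proxy.
    return 0.9 if persona["zip"] == "10001" else 0.4

personas = [
    {"name": "bot_a", "zip": "10001"},  # affluent-coded profile
    {"name": "bot_b", "zip": "60621"},  # low-income-coded profile
]
scores = {p["name"]: query_system(p) for p in personas}
disparity = max(scores.values()) - min(scores.values())
print(disparity > 0.2)  # → True: a gap worth investigating
```

Real audits of search engines or job sites are far messier (noise, personalization, rate limits), but the core comparison is the same: hold the request fixed, vary only the persona, and measure the gap.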
67%
Facebook also must be accountable to all of us—which means opening its platform to more data auditors.
69%
Another model for the common good has emerged in the field of social work. It’s a predictive model that pinpoints households where children are most likely to suffer abuse.
69%
Yet if the goal is not to punish the parents, but instead to provide help to children who might need it, a potential WMD turns benign.
Chris Aldrich
Like Tantek's goal to focus on the positive.
69%
they must also deliver transparency, disclosing the input data they’re using as well as the results of their targeting. And they must be open to audits.
69%
My hope is that they’ll be remembered, like the deadly coal mines of a century ago, as relics of the early days of this new revolution, before we learned how to bring fairness and accountability to the age of data.
70%
what policy could Facebook adopt that would be seen as universally fair? It wouldn’t be an easy question to answer even if Facebook were dealing with its influence and responsibility openly and publicly (which it’s decidedly not).
71%
Generally speaking, the job of algorithmic accountability should start with the companies that develop and deploy the algorithms. They should accept responsibility for their influence and develop evidence that what they’re doing isn’t causing harm,
71%
the burden of proof rests on companies, which should be required to audit their algorithms regularly for legality, fairness, and accuracy.
72%
Examples like this demonstrate how critical an issue accountability standards are becoming. When algorithmic systems like the Midas system contain fatal flaws, whether intentional or not, they end up being worse than the human systems they’ve replaced.
73%
mammoth companies like Google, Amazon, and Facebook exert incredible control over society because they control the data. They reap enormous profits while somehow offloading fact-checking responsibilities to others.