259 pages, Paperback
First published September 6, 2016
"Data is not going away. Nor are computers—much less mathematics. Predictive models are, increasingly, the tools we will be relying on to run our institutions, deploy our resources, and manage our lives. But as I’ve tried to show throughout this book, these models are constructed not just from data but from the choices we make about which data to pay attention to—and which to leave out. Those choices are not just about logistics, profits, and efficiency. They are fundamentally moral."

Exactly. We still have to use our brains, not just our computers. It is critical that we inject morality into the process, or it will always be fundamentally unfair in some way or another, especially if the intent is to increase profits for one entity at the expense of another. One simply can't include enough variables or specifics. Some universities have begun to audit the algorithms; Princeton's Transparency and Accountability Project, for example, masquerades as people of differing backgrounds to see what kind of treatment those personas receive from online marketers.
"Big data has plenty of evangelists, but I'm not one of them. This book will focus sharply in the other direction, on the damage inflicted by WMDs and the injustice they perpetuate. We will explore harmful examples that affect people at critical life moments: going to college, borrowing money, getting sentenced to prison, finding and holding a job. All these life domains are increasingly controlled by secret models wielding arbitrary punishments. Welcome to the dark side of big data."

In the following passage, the author defines some of the shortcomings she observed in how big data was being used:
"More and more I was worried about the separation between technical models and real people, and about the moral repercussions of that separation. In fact, I saw the same pattern emerging that I had witnessed in finance: a false sense of security was leading to widespread use of imperfect models, self-serving definitions of success, and growing feedback loops. Those who objected were regarded as nostalgic Luddites."

The author worked at a hedge fund during the 2008 financial collapse. Thus, when she moved into the field of consumer data modeling, she looked for flaws in the use of data similar to those that led to the credit crisis.
"I wondered what the analog to the credit crisis might be in big data. Instead of a bust, I saw a growing dystopia, with inequality rising. The algorithms would make sure that those deemed losers would remain that way. A lucky minority would gain ever more control over the data economy, raking in outrageous fortunes and convincing themselves all the while that they deserved it."

The following is the author's summary near the end of the book:
"In this march through a virtual lifetime we've visited school and college, the courts and the workplace, even the voting booth. Along the way we have witnessed the destruction caused by WMDs. Promising efficiency and fairness, they distort higher education, drive up debt, spur mass incarceration, pummel the poor at nearly every juncture, and undermine democracy."

The author, Cathy O'Neil, is highly qualified (she earned her Ph.D. in math at Harvard), has work experience from inside the system (as a "quant" at D.E. Shaw, a major hedge fund), and has since evolved into an "Occupy Wall Street" activist. Thus she has the experience and qualifications needed to explain how big data affects our lives. It's needed information.
Cathy O'Neil writes well and provides some good information, but I found the fundamental thesis of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy to be foolish and irresponsible. The title, condensed inevitably to WMD, is used promiscuously to describe any application of data science, statistics, or even information technology that she considers to be unfair or discriminatory. In some cases, she may be right; some applications are used by the greedy and the uncaring with insufficient thought of the consequences, but she rarely considers the positive attributes of the applications. Once an application is branded as a WMD, a term indelibly tied to George W. Bush, Saddam Hussein, and the Iraq War, how are readers meant to weigh its many positive aspects, or to consider what can be done to make it better?
At first, I thought that Ms. O'Neil had been the victim of a foolish PR person in her publishing company who had come up with this catchy name, but her ardent use of the term convinced me that she had invented it.
Ms. O'Neil's own career seemed to follow a downward trajectory from her Harvard PhD in mathematics and a teaching position at Barnard to a leading hedge fund and a series of other ethically questionable environments that left her convinced that data science, and even mathematics, are roots of much evil. I don't buy it. They are tools that can be used or misused, and her diatribe does nothing to help.

She asserts, on page 210:
"If we find (as studies have already shown) that the recidivism models codify prejudice and penalize the poor, then it's time to have a look at the inputs. In this case, they include loads of birds-of-a-feather connections. They predict an individual's behavior on the basis of the people he knows, his job, and his credit rating—details that would be inadmissible in court. The fairness fix is to throw out that data.

But wait, many would say. Are we going to sacrifice the accuracy of the model for fairness? Do we have to dumb down our algorithms?

In some cases, yes. If we're going to be equal before the law, or be treated equally as voters, we cannot stand for systems that drop us into different castes and treat us differently."

I disagree. Some data is better than none (provided it is accurate and not intentionally manipulative), though we should be looking to constantly improve our data science systems. I would like to know that the recently paroled criminal moving in next door had been released by a judge in full possession of whatever facts are available. So, I suspect, would Ms. O'Neil.
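The accuracy-versus-fairness trade-off in the page-210 passage can be made concrete with a toy model. Everything below is hypothetical and invented for illustration (the data, the feature names, and the hand-tuned scoring rule are not from the book): a "recidivism" score is computed with and without a proxy feature, where "high_poverty_zip" stands in for the birds-of-a-feather data O'Neil says should be thrown out.

```python
# Toy illustration (all data invented) of the accuracy-vs-fairness
# trade-off: a recidivism score computed with and without a proxy feature.

records = [
    # priors = prior offenses (a legitimate input);
    # reoffended = observed outcome in this made-up sample.
    {"priors": 0, "high_poverty_zip": 1, "reoffended": 1},
    {"priors": 0, "high_poverty_zip": 1, "reoffended": 1},
    {"priors": 0, "high_poverty_zip": 1, "reoffended": 0},
    {"priors": 2, "high_poverty_zip": 0, "reoffended": 1},
    {"priors": 0, "high_poverty_zip": 0, "reoffended": 0},
    {"priors": 0, "high_poverty_zip": 0, "reoffended": 0},
]

def predict(record, use_proxy):
    """Hand-tuned linear score; returns 1 = predicted to reoffend."""
    score = 2 * record["priors"]
    if use_proxy:
        # The proxy: where you live counts against you.
        score += 3 * record["high_poverty_zip"]
    return 1 if score >= 3 else 0

def accuracy(use_proxy):
    hits = sum(predict(r, use_proxy) == r["reoffended"] for r in records)
    return hits / len(records)

# The proxy boosts measured accuracy on this sample, but only by
# flagging everyone from the poor zip code as a future offender.
print(f"with proxy:    {accuracy(True):.2f}")   # 0.83
print(f"without proxy: {accuracy(False):.2f}")  # 0.67
```

On this invented sample, the proxy raises accuracy from 0.67 to 0.83, yet every extra "hit" comes from treating a zip code as evidence, which is exactly the caste system the passage objects to.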