Kindle Notes & Highlights
by Cathy O'Neil
powerful discipline of applied mathematics called “operations research,”
usually people who badly need money. And because they need money so desperately, the companies can bend their lives to the dictates of a mathematical model.
take steps not to make people’s lives too miserable.
face heightened anxiety and sleep deprivation, which causes dramatic mood swings and is responsible for an estimated 13 percent of highway deaths.
But computing systems have trouble finding digital proxies for these kinds of soft skills. The relevant data simply isn’t collected, and anyway it’s hard to put a value on them. They’re usually easier to leave out of a model.
The 90 percent difference in scores only made me realize how ridiculous the entire value-added model is when it comes to education.”
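A quick way to see why a single year's value-added score can swing so wildly: with only a classful of students, random noise swamps any real teacher effect. The simulation below is a minimal sketch of that statistical point; the effect size, noise level, and class size are invented for illustration, and this is not the actual value-added formula.

```python
# Hypothetical illustration (not the real value-added model): with only
# ~30 students per class, random variation alone can swing a teacher's
# estimated "effect" dramatically from one year to the next.
import random

random.seed(42)

TRUE_TEACHER_EFFECT = 2.0   # assumed true boost in test points
STUDENT_NOISE_SD = 15.0     # assumed spread of student-level factors
CLASS_SIZE = 30

def observed_score(n=CLASS_SIZE):
    """Average measured gain for one class: true effect plus noise."""
    gains = [random.gauss(TRUE_TEACHER_EFFECT, STUDENT_NOISE_SD) for _ in range(n)]
    return sum(gains) / n

year1, year2 = observed_score(), observed_score()
print(f"Year 1 estimate: {year1:+.1f}, Year 2 estimate: {year2:+.1f}")
# The same teacher can look excellent one year and terrible the next,
# even though nothing about the teaching changed.
```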
Nation at Risk report. It
A Nation at Risk was published with such fanfare, researchers at Sandia National Laboratories took a second look at the data gathered for the report.
Statistically speaking, in these attempts to free the tests from class and color, the administrators moved from a primary to a secondary model.
Once people recognize them and understand their statistical flaws, they’ll demand evaluations that are fairer for both students and teachers. However, if the goal of the testing is to find someone to blame, and to intimidate workers, then, as we’ve seen, a WMD that spews out meaningless scores gets an A-plus.
Note: Look at testing and scores in Germany and other European nations under Hitler, when boys were directed first to athletics and physical development, then to subject matter.
Many of their pseudoscientific models attempt to predict our creditworthiness, giving each of us so-called e-scores. These numbers, which we rarely see, open doors for some of us, while slamming them in the face of others. Unlike the FICO scores they resemble, e-scores are arbitrary, unaccountable, unregulated, and often unfair—in short, they’re WMDs.
But they sell and provide guidance to the consumer on how to be successful in the marketplace. Let's open a new Equifax- or Experian-style service; these services also control the market and can ruin it. See the housing crash and score manipulation.
called Neustar offers a prime example. Neustar provides customer targeting services
2003 Taurus on Carfax.com. Most scoring systems also pick up the location of the visitor’s computer. When this is matched with real estate data, they can draw inferences about wealth. A person using a computer on San Francisco’s Balboa Terrace is a far better prospect than the
into largely unregulated pools of data, such as clickstreams and geo-tags, in order to create a parallel data marketplace. In the process, they can largely avoid government oversight. They then measure success by gains in efficiency, cash flow, and profits. With few exceptions, concepts like justice and transparency don’t fit into their algorithms.
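To make concrete how the e-score and targeting systems described above might fold location and clickstream proxies into a single number, here is a minimal hypothetical sketch. Every field name, weight, and threshold is invented for illustration; real scoring models such as Neustar's are proprietary.

```python
# Hypothetical sketch of a proxy-based "e-score" of the kind described
# above. All inputs and weights are invented; this is not any vendor's
# actual model.
from dataclasses import dataclass

@dataclass
class Visitor:
    median_home_value: float   # inferred from geolocation + real estate data
    clicked_premium_ads: int   # behavior pulled from a clickstream
    nighttime_caller: bool     # call-center metadata

def e_score(v: Visitor) -> float:
    """Combine proxies into a single prospect score (0-100, higher = 'better')."""
    score = 50.0
    score += min(v.median_home_value / 50_000, 30)  # wealth proxy dominates
    score += 2 * v.clicked_premium_ads
    score -= 10 if v.nighttime_caller else 0
    return max(0.0, min(100.0, score))

# A visitor browsing from an expensive neighborhood is routed to the
# front of the queue; a poorer one waits. That is the feedback loop
# the passage describes.
print(e_score(Visitor(1_500_000, 3, False)))  # Balboa Terrace-style prospect
print(e_score(Visitor(150_000, 0, True)))     # deprioritized caller
```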
Consider this. As of 2015, white households held on average roughly ten times as much money and property as black and Hispanic households. And while only 15 percent of white households had zero or negative net worth, more than a third of black and Hispanic households found themselves with no cushion. This wealth gap increases with age. By their sixties, whites are eleven times richer than African Americans. Given these
(Wealthy travelers, by contrast, are often able to pay to acquire “trusted traveler” status, which permits them to waltz through security. In effect, they’re spending money to shield themselves from a WMD.)
Acxiom Corp.
profiles, while expanding the risk for errors. Recently, Google processed images of a trio of happy young African Americans and its automatic photo-tagging service labeled them as gorillas. The company apologized profusely, but in systems like Google's, errors are inevitable. It was most likely faulty machine learning (and probably not a racist running loose in the Googleplex) that led the computer to confuse Homo sapiens with gorillas.
Note: Is the AI database looking at the bigger picture of everything, versus just the picture? Now white women with black males... purpose: social manipulation or AI adjustments? White males?
Wisconsin Department of Insurance, the CFA listed one hundred thousand microsegments in Allstate’s pricing schemes.
This comes down especially hard on black women, who often have high BMIs. But isn’t it a good thing, wellness advocates will ask, to help people deal with their weight and other health issues?
scale. Within hours, Facebook could harvest information from tens of millions of people, or more, measuring the impact that their words and shared links had on each other. And it could use that knowledge to influence people’s actions, which in this case happened to be voting.
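The measurement the passage describes is, at bottom, a randomized exposure experiment: show a message to one group, withhold it from another, and compare outcome rates. The sketch below is a minimal, hedged illustration of that arithmetic; the turnout numbers are made up.

```python
# Sketch of at-scale impact measurement via randomized exposure.
# All counts below are invented for illustration.
import math

def lift(treated_yes, treated_n, control_yes, control_n):
    """Difference in outcome rates, with a rough 95% confidence interval."""
    p_t = treated_yes / treated_n
    p_c = control_yes / control_n
    se = math.sqrt(p_t * (1 - p_t) / treated_n + p_c * (1 - p_c) / control_n)
    diff = p_t - p_c
    return diff, (diff - 1.96 * se, diff + 1.96 * se)

# Hypothetical: 10M users shown the message, 1M held out as a control.
diff, ci = lift(treated_yes=4_080_000, treated_n=10_000_000,
                control_yes=404_000, control_n=1_000_000)
print(f"Estimated lift in turnout: {diff:.2%} (95% CI {ci[0]:.2%} to {ci[1]:.2%})")
# At this scale, even a fraction-of-a-percent nudge moves hundreds of
# thousands of votes, which is what makes the capability so potent.
```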
Like doctors, data scientists should pledge a Hippocratic Oath, one that focuses on the possible misuses and misinterpretations of their models. Following the market crash of 2008, two financial engineers, Emanuel Derman and Paul Wilmott, drew up such an oath. It reads:
~ I will remember that I didn't make the world, and it doesn't satisfy my equations.
~ Though I will use models boldly to estimate value, I will not be overly impressed by mathematics.
~ I will never sacrifice reality for elegance without explaining why I have done so.
~ Nor will I give the people who use my model false comfort
The Americans with Disabilities Act (ADA), which protects people with medical issues from being discriminated against at work, also needs an update. The bill currently prohibits medical exams as part of an employment screening. But we need to update it to take into account Big Data personality tests, health scores, and reputation scores. They all sneak around the law, and they shouldn’t be able to. One possibility already under discussion would extend protection of the ADA to include “predicted” health outcomes down the road. In other words, if a
For example, between 2013 and 2015 Michigan’s unemployment agency deployed a destructive and greedy algorithm, ironically called Midas.