Kindle Notes & Highlights
by Cathy O'Neil
A computer program could speed through thousands of résumés or loan applications in a second or two and sort them into neat lists, with the most promising candidates on top.
punish the poor and the oppressed in our society, while making the rich richer.
The privileged, we’ll see time and again, are processed more by people, the masses by machines.
It’s an algorithm, they were told. It’s very complex. This discouraged many from pressing further.
Conditions change, and so must the model.
the model takes what we know and uses it to predict responses in various situations.
The updates and adjustments make it what statisticians call a “dynamic model.”
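The idea of a dynamic model — one whose predictions are revised as new data arrives — can be illustrated with a toy sketch. This is not from the book; the class and the numbers are invented for illustration, and a real scoring model would be far more elaborate:

```python
class DynamicModel:
    """Toy predictor: maintains a running average and updates it with feedback."""

    def __init__(self):
        self.count = 0
        self.estimate = 0.0

    def predict(self):
        # The model takes what we know so far and uses it to predict the next outcome.
        return self.estimate

    def update(self, observed):
        # Conditions change, and so must the model: fold each new observation
        # into the estimate incrementally.
        self.count += 1
        self.estimate += (observed - self.estimate) / self.count


model = DynamicModel()
for outcome in [10, 12, 11, 20]:  # new data arriving over time
    model.update(outcome)

print(model.predict())  # 13.25 — the running mean of the observations
```

Each call to `update` nudges the estimate toward the latest observation, which is the essential loop behind any statistician's "dynamic model": predict, observe, adjust, repeat.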
When we ask Google Maps for directions, it models the world as a series of roads, tunnels, and bridges. It ignores the buildings, because they aren’t relevant to the task.
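The abstraction described here — the world reduced to a network of roads, with everything irrelevant left out — is the same one behind any shortest-path routine. A minimal sketch using Dijkstra's algorithm (the place names and travel times are invented, and this is not Google's actual implementation):

```python
import heapq

# A tiny road network modeled as a weighted graph: each road segment is an
# edge labeled with a travel time in minutes. Buildings, scenery, and
# everything else irrelevant to routing are simply absent from the model.
roads = {
    "Home":   [("Bridge", 5), ("Tunnel", 9)],
    "Bridge": [("Office", 7)],
    "Tunnel": [("Office", 2)],
    "Office": [],
}


def shortest_time(graph, start, goal):
    """Dijkstra's algorithm: cheapest total travel time from start to goal."""
    frontier = [(0, start)]   # priority queue of (cost so far, node)
    best = {start: 0}         # cheapest known cost to reach each node
    while frontier:
        cost, node = heapq.heappop(frontier)
        if node == goal:
            return cost
        for neighbor, weight in graph[node]:
            new_cost = cost + weight
            if new_cost < best.get(neighbor, float("inf")):
                best[neighbor] = new_cost
                heapq.heappush(frontier, (new_cost, neighbor))
    return None  # goal unreachable in this road network


print(shortest_time(roads, "Home", "Office"))  # 11 — the Tunnel route beats the Bridge
```

The point of the example is the modeling choice, not the algorithm: once the task is "find the fastest route," only roads and travel times need to exist in the model.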
Racism, at the individual level, can be seen as a predictive model whirring away in billions of human minds around the world. It is built from faulty, incomplete, or generalized data. Whether it comes from experience or hearsay, the data indicates that certain types of people have behaved badly. That generates a binary prediction that all people of that race will behave that same way.
If a bank’s model of a high-risk borrower, for example, is applied to you, the world will treat you as just that, a deadbeat—even if you’re horribly misunderstood. And when that model scales, as the credit model has, it affects your whole life—whether you can get an apartment or a job or a car to get from one to the other.

