Kindle Notes & Highlights
Read between December 17, 2016 and September 4, 2017
The only way to understand a mechanism such as the eye, he thought, was by studying the mistakes that it made. Error wasn’t merely instructive; it was the key that might unlock the deep nature of the mechanism. “How do you understand memory?” he asked. “You don’t study memory. You study forgetting.”
The successful fighter pilots were better able to switch attention than the unsuccessful ones, and both were better at it than Israeli bus drivers.
the most effective way to teach people longer strings of information was to feed the information into their minds in smaller chunks.
“Someone once said that education was knowing what to do when you don’t know,”
This is what happens when people become attached to a theory. They fit the evidence to the theory rather than the theory to the evidence. They cease to see what’s right under their nose.
Everywhere one turned, one found idiocies that were commonly accepted as truths only because they were embedded in a theory to which the scientists had yoked their careers.
Danny was always sure he was wrong. Amos was always sure he was right.
Even people trained in statistics and probability theory failed to intuit how much more variable a small sample could be than the general population—and that the smaller the sample, the lower the likelihood that it would mirror the broader population. They assumed that the sample would correct itself until it mirrored the population from which it was drawn.
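This is the bias Kahneman and Tversky dubbed belief in the "law of small numbers," and it is easy to check by simulation. The Python sketch below is an illustration only (the population rate, the sample sizes, and the trial count are all invented): it draws repeated samples of various sizes from a population in which half the members share a trait, and measures how far each sample's proportion strays from the true 50 percent.

```python
import random
import statistics

# Illustration of small-sample variability (numbers invented, not from
# the book): draw repeated samples from a population in which 50% of
# members share a trait, and see how far each sample's proportion strays.

random.seed(42)
POPULATION_RATE = 0.5   # true proportion in the population
TRIALS = 10_000         # simulated samples per sample size

for sample_size in (5, 20, 100, 1000):
    deviations = []
    for _ in range(TRIALS):
        hits = sum(random.random() < POPULATION_RATE for _ in range(sample_size))
        deviations.append(abs(hits / sample_size - POPULATION_RATE))
    print(f"n={sample_size:5d}  mean |sample rate - true rate| = "
          f"{statistics.mean(deviations):.3f}")
```

The deviation shrinks roughly with the square root of the sample size, which is exactly the relationship that even trained statisticians failed to intuit.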
This failure of human intuition had all sorts of implications for how people moved through the world, and rendered judgments and made decisions,
But because he has misjudged how large the sample needs to be if it is to stand a good chance of reflecting the entire population, he is at the mercy of luck.
In their search for scientific truth, they were relying far more than they knew on chance. What’s more, because they had so much faith in the power of small samples, they tended to rationalize whatever they found in them.
The psychologists had so much faith in small samples that they assumed that whatever had been learned from either group must be generally true, even if one lesson seemed to contradict the other.
“Edwards . . . has argued that people fail to extract sufficient information or certainty from probabilistic data; he called this failure conservatism. Our respondents can hardly be described as conservative. Rather, in accord with the representation hypothesis, they tend to extract more certainty from the data than the data, in fact, contain.”
The power of the pull of a small amount of evidence was such that even those who knew they should resist it succumbed. People’s “intuitive expectations are governed by a consistent misperception of the world,”
Most spheres of human activity lacked the data to build the algorithms that might replace the human judge.
For most of the thorny problems in life, people would need to rely on the expert judgment of some human being:
they set out to create a model of what experts were doing when they formed their judgments.
he set out to analyze how experts drew their conclusions. Of course you might simply ask the experts how they did it—but that was a highly subjective approach.
A better way to get at expert thinking, Hoffman argued, was to take the various inputs the experts used to make their decisions (“cues,” he called these inputs) and infer from those decisions the weights they had placed on the various inputs.
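In practice, inferring the weights amounts to fitting a simple statistical model to the expert's own judgments. The sketch below is a hypothetical reconstruction of the idea rather than Hoffman's actual procedure: the cues, the weights, and the expert's inconsistency are all invented, and ordinary least squares stands in for whatever fitting method he used.

```python
import numpy as np

# Hypothetical reconstruction of Hoffman's idea (all data invented):
# given the cues an expert saw and the judgments he made, recover the
# weights he implicitly placed on each cue by least squares.

rng = np.random.default_rng(0)
n_cases, n_cues = 200, 3
cues = rng.normal(size=(n_cases, n_cues))   # features of each case

# Pretend the expert implicitly weights the cues 0.7, 0.2, 0.1, with
# some inconsistency (noise) in his judgments from case to case.
true_weights = np.array([0.7, 0.2, 0.1])
judgments = cues @ true_weights + rng.normal(scale=0.3, size=n_cases)

# Infer the weights from the (cues, judgment) pairs alone.
inferred, *_ = np.linalg.lstsq(cues, judgments, rcond=None)
print("inferred weights:", inferred.round(2))   # ~ [0.7, 0.2, 0.1]
```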
the small mountain of research that suggested that expert judgment was less reliable than algorithms.
experts tended to describe their thought processes as subtle and complicated and difficult to model.
The researchers’ goal was to see if they could create an algorithm that would mimic the decision making of doctors.
the simple model that the researchers had created as their starting point for understanding how doctors rendered their diagnoses proved to be extremely good at predicting the doctors’ diagnoses.
More surprisingly, the doctors’ diagnoses were all over the map: The experts didn’t agree with each other. Even more surprisingly, when presented with duplicates of the same ulcer, every doctor had contradicted himself and rendered more than one diagnosis: These doctors apparently could not even agree with themselves.
Even more bizarrely, those with the least training (graduate students) were just as accurate as the fully trained ones (paid pros) in their predictions
What was lacking was “immediate feedback.”
“It now appears that our initial formulation of the problem of learning clinical inference was far too simple—that a good deal more than outcome feedback is necessary for judges to learn a task as difficult as this one,”
You could beat the doctor by replacing him with an equation created by people who knew nothing about medicine and had simply asked a few questions of doctors.
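This is the result judgment researchers came to call "bootstrapping" the judge: a model fitted to an expert's own past judgments can outperform the expert, because it applies his implicit weights with perfect consistency. The sketch below illustrates the logic under invented assumptions (the cues, the weights, and the doctor's inconsistency are all made up):

```python
import numpy as np

# Hypothetical sketch of "the model of the judge beats the judge"
# (all numbers invented). The doctor weighs the cues sensibly but
# inconsistently; a model fitted to his own judgments applies the
# same weights without the inconsistency.

rng = np.random.default_rng(1)
n_cases, n_cues = 500, 4
cues = rng.normal(size=(n_cases, n_cues))
weights = np.array([0.5, 0.3, 0.15, 0.05])

truth = cues @ weights                                # actual outcomes
doctor = truth + rng.normal(scale=0.5, size=n_cases)  # noisy judgments

# Fit a linear model to the doctor's judgments, then predict outcomes.
model_w, *_ = np.linalg.lstsq(cues, doctor, rcond=None)
model = cues @ model_w

print("doctor vs. truth r =", round(np.corrcoef(doctor, truth)[0, 1], 3))
print("model  vs. truth r =", round(np.corrcoef(model, truth)[0, 1], 3))
```

The model's only advantage is consistency: it never has an off day and never contradicts itself, which is precisely where the doctors in the ulcer study failed.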
The problem was not what they knew, or didn’t know. It was their need for certainty or, at least, the appearance of certainty.
To acknowledge uncertainty was to admit the possibility of error. The entire profession had arranged itself as if to confirm the wisdom of its decisions.
the doctor typically attributed the recovery to the treatment he had prescribed, without any solid evidence that the treatment was responsible.
As it happens, a movement was taking shape right then and there in Toronto that came to be called “evidence-based medicine.” The core idea of evidence-based medicine was to test the intuition of medical experts—to check the thinking of doctors against hard data. When subjected to scientific investigation, some of what passed for medical wisdom turned out to be shockingly wrong-headed.
in 1980, for instance, the conventional wisdom held that if a heart attack victim suffered from some subsequent arrhythmia, you gave him drugs to suppress it.
researchers had shown that heart attack patients whose arrhythmia was suppressed died more often than the ones...
But it was clear that the intuitive judgments of doctors could be gravely flawed: The evidence of the medical trials now could not be ignored. And Redelmeier was alive to the evidence. “I became very aware of the buried analysis—that a lot of the probabilities were being made up by expert opinion,”
Hal Sox happened to have coauthored the first article Amos ever wrote about medicine.
“The physician is meant to be the perfect agent for the patient as well as the protector of society,” he said. “Physicians deal with patients one at a time, whereas health policy makers deal with aggregates.”
But there was a conflict between the two roles.
A part of good science is to see what everyone else can see but think what no one else has ever said. The difference between being very smart and very foolish is often very small. So many problems occur when people fail to be obedient when they are supposed to be obedient, and fail to be creative when they are supposed to be creative. The secret to doing good research is always to be a little underemployed. You waste years by not being able to waste hours. It is sometimes easier to make the world a better place than to prove you have made the world a better place.
the discipline he imposed became familiar. "He needs the concrete examples to test his general theories,"
They didn’t simply experience fixed levels of happiness or unhappiness. They experienced one thing and remembered something else.
Their memory of pain was different from their experience of it. They remembered moments of maximum pain, and they remembered, especially, how they felt the moment the pain ended. But they didn’t particularly remember the length of the painful experience.
people preferred to endure more total pain so long as the experience ended on a more pleasant note.
Amos seemed like the practical one, but Danny, more than Amos, had a gift for finding solutions to problems where others failed even to notice that there was a problem to solve.
Danny pointed out that people learn more efficiently in short bursts, and that new tank drivers might be educated faster if the trainees rotated behind the wheel every thirty minutes.
Experts on decision making would sit with leaders in business, the military, and government and help them to frame every decision explicitly as a gamble; to calculate the odds of this or that happening; and to assign values to every possible outcome.
They would learn to evaluate a decision not by its outcomes—whether it turned out to be right or wrong—but by the process that led to it. The job of the decision maker wasn’t to be right but to figure out the odds in any decision and play them well.
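At its simplest, framing a decision as a gamble reduces to expected-value arithmetic: enumerate the possible outcomes, attach a probability and a value to each, and compare the sums. The sketch below is an invented example of that bookkeeping (the options, probabilities, and dollar values are all hypothetical):

```python
# Hypothetical decision framed as a gamble (all numbers invented):
# each option is a list of (probability, value) pairs over its outcomes.

options = {
    "launch product": [(0.3, 1_000_000), (0.7, -200_000)],
    "delay a year":   [(0.9, 150_000), (0.1, -50_000)],
}

for name, outcomes in options.items():
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9  # probabilities sum to 1
    expected = sum(p * v for p, v in outcomes)
    print(f"{name}: expected value = {expected:+,.0f}")
```

On these numbers, launching is the better decision, and by this standard it remains the better decision even if the 70-percent bad outcome occurs: the process, not the result, is what gets judged.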
what was needed was a “transformation of cultural attitudes to uncertainty and to risk.”