Kindle Notes & Highlights
From his stint as a consultant he learned something valuable, however. It seemed to him that a big part of a consultant’s job was to feign total certainty about uncertain things. In a job interview with McKinsey, he was told that he was not certain enough in his opinions. “And I said it was because I wasn’t certain. And they said, ‘We’re billing clients five hundred grand a year, so you have to be sure of what you are saying.’ ” The consulting firm that eventually hired him was forever asking him to exhibit confidence when, in his view, confidence was a sign of fraudulence. They’d asked him ...
People who didn’t know Daryl Morey assumed that because he had set out to intellectualize basketball he must also be a know-it-all. In his approach to the world he was exactly the opposite. He had a diffidence about him—an understanding of how hard it is to know anything for sure. The closest he came to certainty was in his approach to making decisions. He never simply went with his first thought. He suggested a new definition of the nerd: a person who knows his own mind well enough to mistrust it.
The last psychologist who showed up claiming to be able to predict behavior had essentially used the Myers-Briggs personality test—and then tried to persuade Morey, after the fact, that he had warded off all manner of unseen problems. The way he’d gone on reminded Daryl Morey of a joke. “The guy walks around with a banana in his ear. And people are like, ‘Why do you have a banana in your ear?’ He says, ‘To keep the alligators away! There are no alligators! See?’ ”
Later, when he was a university professor, Danny would tell students, “When someone says something, don’t ask yourself if it is true. Ask what it might be true of.” That was his intellectual instinct, his natural first step to the mental hoop: to take whatever someone had just said to him and try not to tear it down but to make sense of it.
The University of Michigan psychologist Dick Nisbett, after he’d met Amos, designed a one-line intelligence test: The sooner you figure out that Amos is smarter than you are, the smarter you are.
When the new mail came in Amos opened anything that interested him and left the rest in its daily pile. Each day the new mail arrived and shoved the old mail down the table. When a pile reached the end of the table Amos pushed it, unopened, off the edge into a waiting garbage can. “The nice thing about things that are urgent,” he liked to say, “is that if you wait long enough they aren’t urgent anymore.” “I would say to Amos I have to do this or I have to do that,” recalled his old friend Yeshu Kolodny. “And he would say, ‘No. You don’t.’ And I thought: lucky man!”
The pilot who was praised because he had flown exceptionally well, like the pilot who was chastised after he had flown exceptionally badly, was simply regressing to the mean. Each would have tended to perform better (or worse) the next time even if the teacher had said nothing at all. An illusion of the mind tricked teachers—and probably many others—into thinking that their words were less effective when they gave pleasure than when they gave pain.
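A toy simulation makes the effect concrete (the model and the numbers here are illustrative, not from the book): score each flight as stable skill plus independent luck, and the extremes of the first flight drift back toward average on the second, with no feedback at all.

    import random

    random.seed(0)
    # Illustrative model: each flight score is stable skill plus independent luck.
    flights = []
    for _ in range(100_000):
        skill = random.gauss(0, 1)
        first = skill + random.gauss(0, 1)   # first flight
        second = skill + random.gauss(0, 1)  # second flight, no feedback in between
        flights.append((first, second))

    # Average second flight for pilots who were exceptional on the first:
    best = [s for f, s in flights if f > 2.0]
    worst = [s for f, s in flights if f < -2.0]
    print(sum(best) / len(best))    # roughly 1.3: well below 2.0 -- regression to the mean
    print(sum(worst) / len(worst))  # roughly -1.3: well above -2.0, praise or no praise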
The more similar the specific case is to the notion in your head, the more likely you are to believe that the case belongs to the larger group. “Our thesis,” they wrote, “is that, in many situations, an event A is judged to be more probable than an event B whenever A appears more representative than B.” The more the basketball player resembles your mental model of an NBA player, the more likely you will think him to be an NBA player.
For instance, in families with six children, the birth order B G B B B B was about as likely as G B G B B G. But Israeli kids—like pretty much everyone else on the planet, it would emerge—naturally seemed to believe that G B G B B G was a more likely birth sequence. Why? “The sequence with five boys and one girl fails to reflect the proportion of boys and girls in the population,” they explained. It was less representative.
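Under the simplifying assumption that each birth is an independent 50-50 event, any specific six-birth sequence has probability (1/2)^6; what differs is only the size of the class the sequence belongs to. A quick check:

    from math import comb

    # Every specific sequence of six independent 50-50 births is equally likely:
    print(0.5 ** 6)    # 0.015625, for B G B B B B and G B G B B G alike

    # What differs is the class size -- far more orderings mix the sexes evenly:
    print(comb(6, 5))  # 6 sequences contain exactly five boys
    print(comb(6, 3))  # 20 sequences contain exactly three boys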
Londoners in the Second World War thought that German bombs were targeted, because some parts of the city were hit repeatedly while others were not hit at all. (Statisticians later showed that the distribution was exactly what you would expect from random bombing.) People find it a remarkable coincidence when two students in the same classroom share a birthday, when in fact there is a better than even chance, in any group of twenty-three people, that two of its members will have been born on the same day. We have a kind of stereotype of “randomness” that differs from true randomness. Our ...
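The twenty-three-person claim is easy to verify by computing the complementary probability that all birthdays differ (ignoring leap years and assuming birthdays are uniform):

    def p_shared_birthday(n, days=365):
        # P(at least two of n people share a birthday)
        # = 1 - P(all n birthdays are distinct).
        p_distinct = 1.0
        for i in range(n):
            p_distinct *= (days - i) / days
        return 1 - p_distinct

    print(p_shared_birthday(23))  # ~0.507 -- already better than an even chance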
Once again, people’s judgment was, systematically, very wrong. And it was wrong, Danny and Amos now proposed, because it was distorted by memory. It was simply easier to recall words that start with K than to recall words with K as their third letter. The more easily people can call some scenario to mind—the more available it is to them—the more probable they find it to be.
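The underlying frequency question is checkable against any word list; the sketch below assumes you can supply one (a system dictionary or a text corpus) and uses a toy sample only to show the mechanics. In typical English text, third-position k words actually outnumber initial-k words, which is what made the recall-based judgment wrong.

    def compare_k_positions(words):
        # Count words beginning with "k" vs. words with "k" as the third letter.
        starts = sum(1 for w in words if w[:1].lower() == "k")
        third = sum(1 for w in words if len(w) >= 3 and w[2].lower() == "k")
        return starts, third

    # Toy sample only; substitute a real word list to reproduce the comparison.
    sample = ["kite", "king", "ask", "like", "lake", "make", "bike", "acknowledge"]
    print(compare_k_positions(sample))  # (2, 6)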
They told their subjects that they had picked a person from a pool of 100 people, 70 of whom were engineers and 30 of whom were lawyers. Then they asked them: What is the likelihood that the selected person is a lawyer? The subjects correctly judged it to be 30 percent. And if you told them that you were doing the same thing, but from a pool that had 70 lawyers in it and 30 engineers, they said, correctly, that there was a 70 percent chance the person you’d plucked from it was a lawyer. But if you told them you had picked not just some nameless person but a guy named Dick, and read them ...
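The normative benchmark here is Bayes' rule: a description that fits lawyers and engineers equally well should leave the base rate untouched. A sketch with assumed likelihood numbers:

    def posterior_lawyer(prior_lawyer, p_desc_if_lawyer, p_desc_if_engineer):
        # Bayes' rule: P(lawyer | description).
        num = p_desc_if_lawyer * prior_lawyer
        den = num + p_desc_if_engineer * (1 - prior_lawyer)
        return num / den

    # An uninformative description of "Dick" (equally likely for either group)
    # should not move the 30 percent base rate at all:
    print(posterior_lawyer(0.30, 0.5, 0.5))  # 0.3
    # Subjects instead behaved as if the base rate no longer mattered.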
It wasn’t that what first came to mind was always wrong; it was that its existence in your mind led you to feel more certain than you should be that it was correct. “Beware of the delirious guy in the emergency unit with the long history of alcoholism,” said Redelmeier, “because you will say, ‘He’s just drunk,’ and you’ll miss the subdural hematoma.” The woman’s surgeons had leapt from her medical history to a diagnosis without considering the base rates.
The entire profession had arranged itself as if to confirm the wisdom of its decisions. Whenever a patient recovered, for instance, the doctor typically attributed the recovery to the treatment he had prescribed, without any solid evidence that the treatment was responsible. Just because the patient is better after I treated him doesn’t mean he got better because I treated him, Redelmeier thought. “So many diseases are self-limiting,” he said. “They will cure themselves. People who are in distress seek care. When they seek care, physicians feel the need to do something. You put leeches on; the ...
“That was the moment I gave up on decision analysis,” said Danny. “No one ever made a decision because of a number. They need a story.” As Danny and Lanir wrote, decades later, after the U.S. Central Intelligence Agency asked them to describe their experience in decision analysis, the Israeli Foreign Ministry was “indifferent to the specific probabilities.” What was the point of laying out the odds of a gamble, if the person taking it either didn’t believe the numbers or didn’t want to know them? The trouble, Danny suspected, was that “the understanding of numbers is so weak that they don’t ...
They spent more than a year working and reworking the same basic idea: In order to explain the paradoxes that expected utility could not explain, and create a better theory to predict behavior, you had to inject psychology into the theory. By testing how people choose between various sure gains and gains that were merely probable, they traced the contours of regret. Which of the following two gifts do you prefer?
Gift A: A lottery ticket that offers a 50 percent chance of winning $1,000
Gift B: A certain $400
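For reference, the gamble has the higher expected value, so a chooser maximizing expected dollars would take Gift A; the arithmetic:

    # Expected dollar value of each gift, straight from the numbers above.
    gift_a = 0.5 * 1000 + 0.5 * 0  # 500.0
    gift_b = 400
    print(gift_a, gift_b)  # 500.0 400 -- yet most people take the certain $400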
People felt greater pleasure going from 0 to $1 million than they felt going from $1 million to $2 million. Of course, expected utility theory also predicted that people would take a sure gain over a bet that offered an expected value of an even bigger gain. They were “risk averse.” But what was this thing that everyone had been calling “risk aversion”? It amounted to a fee that people paid, willingly, to avoid regret: a regret premium. Expected utility theory wasn’t exactly wrong. It simply did not understand itself, to the point where it could not defend itself against seeming ...
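Diminishing sensitivity is exactly what a concave utility curve encodes. A sketch using u(w) = log(w), a standard illustrative choice rather than anything from the book: the first million is worth more than the second, and the certainty equivalent of a gamble falls below its expected value, which is the "fee" of risk aversion.

    import math

    u = math.log  # concave, so gains shrink in value as wealth grows

    # First million vs. second million (starting wealth of $1 to avoid log(0)):
    print(u(1_000_001) - u(1))           # ~13.8
    print(u(2_000_001) - u(1_000_001))   # ~0.69 -- far smaller

    # Certainty equivalent of a 50-50 shot at ending with $1,000 or $2,000:
    eu = 0.5 * u(1000) + 0.5 * u(2000)
    print(math.exp(eu))  # ~1414 -- less than the $1,500 expected wealth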
In people’s perceptions of money, as surely as in their perception of light and sound and the weather and everything else under the sun, what mattered was not the absolute levels but changes. People making choices, especially choices between gambles for small sums of money, made them in terms of gains and losses; they weren’t thinking about absolute levels.
One day, toward the end of 1974, as they looked over the gambles they had put to their subjects, Amos asked, “What if we flipped the signs?” Till that point, the gambles had all involved choices between gains. Would you rather have $500 for sure or a 50-50 shot at $1,000? Now Amos asked, “What about losses?” As in: Which of the following do you prefer?
Gift A: A lottery ticket that offers a 50 percent chance of losing $1,000
Gift B: A certain loss of $500
It was instantly obvious to them that if you stuck minus signs in front of all these hypothetical gambles and asked people to reconsider ...
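A value function defined over changes (concave for gains, convex and steeper for losses) reproduces the flip. The sketch below borrows the functional form and parameters Tversky and Kahneman published much later, in 1992; at this point in the story they had only the qualitative shape.

    def v(x, alpha=0.88, lam=2.25):
        # Prospect-theory-style value of a gain or loss of x dollars.
        if x >= 0:
            return x ** alpha
        return -lam * (-x) ** alpha

    # Gains: the sure $500 beats the 50-50 shot at $1,000 (risk averse).
    print(0.5 * v(1000), v(500))    # ~218 vs ~237 -> take the sure thing

    # Losses: flip the signs and the gamble wins (risk seeking).
    print(0.5 * v(-1000), v(-500))  # ~-491 vs ~-534 -> take the gamble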
As they sorted through the implications of their new discovery, one thing was instantly clear: Regret had to go, at least as a theory. It might explain why people made seemingly irrational decisions to accept a sure thing over a gamble with a far greater expected value. It could not explain why people facing losses became risk seeking. Anyone who wanted to argue that regret explains why people prefer a certain $500 to an equal chance to get $0 and $1,000 would never be able to explain why, if you simply subtracted $1,000 from all the numbers and turned the sure thing into a $500 loss, people ...
The first was the realization that people responded to changes rather than absolute levels. The second was the discovery that people approached risk very differently when it involved losses than when it involved gains.
Now they saw that people reacted differently to different degrees of uncertainty. When you gave them one bet with a 90 percent chance of working out and another with a 10 percent chance of working out, they did not behave as if the first was nine times as likely to work out as the second. They made some internal adjustment, and acted as if a 90 percent chance was actually slightly less than a 90 percent chance, and a 10 percent chance was slightly more than a 10 percent chance. They responded to probabilities not just with reason but with emotion. Whatever that emotion was, it became stronger ...
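That internal adjustment is now usually modeled with a probability weighting function. The one-parameter form Tversky and Kahneman later published (1992), with their estimated gamma of 0.61, shows the pattern the text describes:

    def w(p, gamma=0.61):
        # Tversky-Kahneman (1992) probability weighting function.
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    print(w(0.10))  # ~0.19: a 10 percent chance is treated as more than it is
    print(w(0.90))  # ~0.71: a 90 percent chance is treated as less than it is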
Their theory explained all sorts of things that expected utility failed to explain. But it implied, as utility theory never had, that it was as easy to get people to take risks as it was to get them to avoid them. All you had to do was present them with a choice that involved a loss. In the more than two hundred years since Bernoulli started the discussion, intellectuals had regarded risk-seeking behavior as a curiosity. If risk seeking was woven into human nature, as Danny and Amos’s theory implied that it was, why hadn’t people noticed it before? The answer, Amos and Danny now thought, was ...
The two questions were effectively identical. In both cases, if you picked the gamble, you wound up with a 50-50 shot at being worth $2,000. In both cases, if you picked the sure thing, you wound up being worth $1,500. But when you framed the sure thing as a loss, people chose the gamble. When you framed it as a gain, people picked the sure thing. The reference point—the point that enabled you to distinguish between a gain and a loss—wasn’t some fixed number. It was a psychological state. “What constitutes a gain or a loss depends on the representation of the problem and on the context in ...
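The equivalence is pure arithmetic over final wealth. The exact wording of the two problems isn't quoted here, so the sketch assumes the standard versions: start with $1,000 and choose between a sure $500 gain or a 50-50 chance to win $1,000; or start with $2,000 and choose between a sure $500 loss or a 50-50 chance to lose $1,000.

    # Final wealth in the gain frame (start at $1,000):
    gain_gamble = (1000 + 1000, 1000 + 0)  # 50-50: $2,000 or $1,000
    gain_sure = 1000 + 500                 # $1,500

    # Final wealth in the loss frame (start at $2,000):
    loss_gamble = (2000 - 0, 2000 - 1000)  # 50-50: $2,000 or $1,000
    loss_sure = 2000 - 500                 # $1,500

    assert gain_gamble == loss_gamble and gain_sure == loss_sure
    # Identical final positions, yet subjects gambled only in the loss frame.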
This one they called “framing.” Simply by changing the description of a situation, and making a gain seem like a loss, you could cause people to completely flip their attitude toward risk, and turn them from risk avoiding to risk seeking. “We invented framing without realizing we were inventing framing,” said Danny. “You take two things that should be identical—the way they differ should be irrelevant—and by showing it isn’t irrelevant, you show that expected utility theory is wrong.” Framing, to Danny, felt like their work on judgment. Here, look, yet another strange trick the mind played on ...
The two problems were identical, but, in the first case, when the choice was framed as a gain, the subjects elected to save 200 people for sure (which meant that 400 people would die for sure, though the subjects weren’t thinking of it that way). In the second case, with the choice framed as a loss, they did the reverse, and ran the risk that they’d kill everyone. People did not choose between things. They chose between descriptions of things. Economists, and anyone else who wanted to believe that human beings were rational, could rationalize, or try to rationalize, loss aversion. But how did ...
Thaler may not have felt all that sure of himself, but he was quick to see that others shouldn’t feel so sure of themselves, either. And he noticed that when he had his fellow economists to dinner, they filled up on cashews, which meant they had less appetite for the meal. More to the point, he noticed that they tended to be relieved when he removed the cashew nuts, so they didn’t ruin their dinners. “The idea that it could make you better off to reduce your choices—that idea was alien to economics,” he said.
Their minds latch onto a story of imminent death, and the story masks the logic of the situation. Amos created a lovely example. He asked people: Which is more likely to happen in the next year, that a thousand Americans will die in a flood, or that an earthquake in California will trigger a massive flood that will drown a thousand Americans? People went with the earthquake.
To explore those boundaries, they finally shoved their subjects’ noses right up against logic. They gave subjects the same description of Linda and asked, simply: “Which of the two alternatives is more probable?”
Linda is a bank teller.
Linda is a bank teller and is active in the feminist movement.
Eighty-five percent still insisted that Linda was more likely to be a bank teller in the feminist movement than she was to be a bank teller.
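Both the flood question and the Linda question violate the same rule: a conjunction can never be more probable than either of its parts, since P(A and B) = P(A) x P(B given A), and P(B given A) is at most 1. A sketch with made-up numbers:

    # Illustrative (made-up) probabilities for the Linda problem:
    p_teller = 0.05                 # P(Linda is a bank teller)
    p_feminist_given_teller = 0.95  # even near-certainty she's a feminist...
    p_both = p_teller * p_feminist_given_teller
    print(p_both, p_teller)         # 0.0475 0.05
    print(p_both <= p_teller)       # True -- for any choice of numbers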
Gigerenzer came to be identified with a strain of thought known as evolutionary psychology, which had in it the notion that the human mind, having adapted to its environment, must be very well suited to it. It certainly wouldn’t be susceptible to systematic biases. Amos found that notion absurd. The mind was more like a coping mechanism than it was a perfectly designed tool. “The brain appears to be programmed, loosely speaking, to provide as much certainty as it can,” he once said, in a talk to a group of Wall Street executives. “It is apparently designed to make the best possible case for a ...