Kindle Notes & Highlights
Read between December 13 - December 24, 2024
this hunger for an expert who knows things with certainty, even when certainty is not possible—has a talent for hanging around.
In a job interview with McKinsey, they told him that he was not certain enough in his opinions. “And I said it was because I wasn’t certain. And they said, ‘We’re billing clients five hundred grand a year, so you have to be sure of what you are saying.’” The consulting firm that eventually hired him was forever asking him to exhibit confidence when, in his view, confidence was a sign of fraudulence. They’d asked him to forecast the price of oil for clients, for instance. “And then we would go to our clients and tell them we could predict the price of oil. No one can predict the price of oil.
...more
That didn’t mean you gave up trying to find an answer; you just couched that answer in probabilistic terms.
an understanding of how hard it is to know anything for sure.
I just think there’s a lot of evidence that gut instincts aren’t very good.”
A funny thing happened when you forced people to cross racial lines in their minds: They ceased to see analogies. Their minds resisted the leap. “You just don’t see it,” said Morey.
Simply knowing about a bias wasn’t sufficient to overcome it:
people to look at some outcome and assume it was predictable all along.
Here there were no German soldiers, only the Milice—the paramilitary force collaborating with the Germans to help them round up Jews and exterminate the French Resistance.
In the story the donkey can’t decide which bundle of hay is closer to him, and so dies of hunger. “Leibowitz would then say that no donkey would do this; a donkey would just go at random to one or the other and eat. It’s only when decisions are made by people that they get more complicated. And then he said, ‘What happens to a country when a donkey makes the decisions that people are supposed to make you can read every day in the paper.’
The WASPs marched around in white lab coats carrying clipboards and thinking up new ways to torture rats and all the while avoided the great wet mess of human experience. The Jews embraced the mess—even the Jews who disdained Freud’s methods and longed for “objectivity” and wished to search for the kinds of truth that might be tested according to the rules of science.
As the sun rose, what he saw instead was an enemy soldier, on a hill, with his back to him: He’d invaded Jordan. (“I nearly started a war.”) The stretch of land beneath the hill in front of them, he saw, was ideally suited for Jordanian snipers looking to pick off Israeli soldiers. Danny turned to sneak his patrol back into Israel, but then he noticed that one of his men was missing his pack. Imagining the dressing-down he’d receive for leaving a pack in Jordan, he and his men crept around the fringes of the kill zone. “It was incredibly dangerous. I knew how stupid it was. But we would stay
...more
Later, when he was a university professor, Danny would tell students, “When someone says something, don’t ask yourself if it is true. Ask what it might be true of.” That was his intellectual instinct, his natural first step to the mental hoop: to take whatever someone had just said to him and try not to tear it down but to make sense of it.
Like a lot of the Jewish kids in Haifa in the early 1950s, Amos joined a leftist youth movement called the Nahal. He was soon elected a leader. The Nahal—the word was an acronym for the Hebrew phrase meaning “Fighting Pioneer Youth”—was a vehicle to move young Zionists from school onto kibbutzim. The idea was that they would serve as soldiers and guard the farm for a couple of years and then become farmers.
Or how his soldiers, even in combat, refused to wear their helmets, claiming that the weather was too hot for them and “if a bullet is going to kill me, it has my name on it anyway.” (To which Amos said, “What about all those bullets addressed ‘To Whom It May Concern’?”)
how Americans believed tomorrow will be better than today, while Israelis were sure tomorrow would be worse;
“You know, Murray, there is no one in the world who is as smart as you think you are.”
Once, after Amos gave a talk, an English statistician had approached him. “I don’t usually like Jews but I like you,” the statistician said. Amos replied, “I usually like Englishmen but I don’t like you.”
There was the time that Amos walked into an Ann Arbor diner and ordered a hamburger with relish. The waiter said they didn’t have relish. Okay, Amos said, I’ll have tomato. We don’t have tomato, either, said the waiter. “Can you tell me what else you don’t have?”
The University of Michigan was then, as it is now, home to the world’s largest department of psychology.
“Similarity judgments,” psychologists called them, in a rare example of comprehensible trade jargon. What goes on in the mind when it evaluates how much one thing is like, or not like, another? The process is so fundamental to our existence that we scarcely stop to think about it. “It’s the process that grinds away constantly and generates much of our understanding and response to the world,” says Berkeley psychologist Dacher Keltner. “First of all it’s, how do you categorize things? And that’s everything. Do I sleep with him or not? Do I eat this or not? Do I give to this person or not? Is
...more
Things are grouped together for a reason, but, once they are grouped, their grouping causes them to seem more like each other than they otherwise would. That is, the mere act of classification reinforces stereotypes. If you want to weaken some stereotype, eliminate the classification.
In his first few months back, Amos gave talks about the latest decision-making theories to the generals in the Israeli army and the Israeli Air Force—even though the practical application of the theories was, to put it mildly, unclear. “I’ve never seen a country so concerned with keeping its officials abreast on new developments in academics,” Barbara wrote to her family back home in Michigan.
The pilot who was praised always performed worse the next time out, and the pilot who was criticized always performed better. Danny watched for a bit and then explained to them what was actually going on: The pilot who was praised because he had flown exceptionally well, like the pilot who was chastised after he had flown exceptionally badly, simply were regressing to the mean. They’d have tended to perform better (or worse) even if the teacher had said nothing at all. An illusion of the mind tricked teachers—and probably many others—into thinking that their words were less effective when they
...more
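A minimal Python sketch of the point in that highlight, using made-up numbers rather than anything from the Israeli flight school (the flight_score helper and the 1.5 cutoff are illustrative assumptions): if each performance is stable skill plus independent luck, the flight after an exceptional one drifts back toward average whether or not anyone says a word, so praise looks harmful and criticism looks helpful.

import random

random.seed(0)

def flight_score(skill):
    # a toy model: true skill plus luck on the day
    return skill + random.gauss(0, 1)

after_praise, after_criticism = [], []
for _ in range(100_000):
    skill = random.gauss(0, 1)
    first, second = flight_score(skill), flight_score(skill)
    if first > 1.5:          # exceptionally good flight, so the pilot is praised
        after_praise.append(second - first)
    elif first < -1.5:       # exceptionally bad flight, so the pilot is chastised
        after_criticism.append(second - first)

# Feedback does nothing in this model, yet the averages tell the instructors' story:
print(sum(after_praise) / len(after_praise))        # negative: "praise hurt"
print(sum(after_criticism) / len(after_criticism))  # positive: "criticism helped"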
“The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information” was a paper, written by Harvard psychologist George Miller, which showed that people had the ability to hold in their short-term memory seven items, more or less. Any attempt to get them to hold more was futile. Miller half-jokingly suggested that the seven deadly sins, the seven seas, the seven days of the week, the seven primary colors, the seven wonders of the world, and several other famous sevens had their origins in this mental truth.
The beauty of the experiment was that there was a correct answer to the question: What is the probability that I am holding the bag of mostly red chips? It was provided by a statistical formula called Bayes’s theorem (after Thomas Bayes, who, strangely, left the formula for others to discover in his papers after his death, in 1761). Bayes’s rule allowed you to calculate the true odds, after each new chip was pulled from it, that the book bag in question was the one with majority white, or majority red, chips. Before any chips had been withdrawn, those odds were 50:50—the bag in your hands was
...more
God, after so much time trying to understand Bayes's theorem, this might be the most intuitive explanation I've seen.
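Since the highlight above cuts off mid-calculation, here is a minimal Python sketch of the book-bag update. The 70/30 red-to-white split and the posterior_mostly_red helper are assumptions for illustration; the excerpt only says one bag is mostly red, the other mostly white, and the prior odds are 50:50.

def posterior_mostly_red(draws, p_red_in_red_bag=0.7, prior=0.5):
    # Return P(the bag is the mostly-red one) after each chip drawn, in order.
    p_red_in_white_bag = 1 - p_red_in_red_bag
    posterior = prior
    history = []
    for chip in draws:  # each chip is "R" or "W"
        if chip == "R":
            like_red, like_white = p_red_in_red_bag, p_red_in_white_bag
        else:
            like_red, like_white = p_red_in_white_bag, p_red_in_red_bag
        numerator = like_red * posterior
        posterior = numerator / (numerator + like_white * (1 - posterior))
        history.append(posterior)
    return history

# Three reds and then a white: the odds start at 50:50 and move with every chip.
print(posterior_mostly_red(["R", "R", "R", "W"]))
# roughly 0.70, 0.84, 0.93, 0.85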
Ward Edwards had coined a phrase to describe how human beings responded to new information. They were “conservative Bayesians.”
In Danny’s view, people were not conservative Bayesians. They were not statisticians of any kind. They often leapt from little information to big conclusions. The theory of the mind as some kind of statistician was of course just a metaphor. But the metaphor, to Danny, felt wrong.
When you are a pessimist and the bad thing happens, you live it twice, Amos liked to say. Once when you worry about it, and the second time when it happens.
The experimental psychologist “rarely attributes a deviation of results from expectations to sampling variability because he finds a causal ‘explanation’ for any discrepancy,” wrote Danny and Amos. “Thus, he has little opportunity to recognize sampling variation in action. His belief in the law of small numbers, therefore, will forever remain intact.”
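A small Python sketch of the sampling-variation point in that passage, with arbitrary numbers (the sample size of ten and the 0.5 gap are illustrative choices): draw two samples from the very same population and see how often their means differ by enough to invite a causal "explanation."

import random

random.seed(2)

def sample_mean(n):
    # mean of n draws from one fixed population
    return sum(random.gauss(0, 1) for _ in range(n)) / n

trials, n, notable_gap = 10_000, 10, 0.5
hits = sum(abs(sample_mean(n) - sample_mean(n)) > notable_gap for _ in range(trials))
print(hits / trials)  # roughly a quarter of the pairs differ this much by chance alone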
“I thought it was a stroke of genius,” recalled Krantz. “I still think it is one of the most important papers that has ever been written. It was counter to all the work that was being done—which was governed by the idea that you were going to explain human judgment by correcting for some more or less minor error to the Bayesian model. It was exactly contrary to the ideas that I had. Statistics was the way you should think about probabilistic situations, but statistics was not the way people did it. Their subjects were all sophisticated in statistics—and even they got it wrong! Every question
...more
The power of the pull of a small amount of evidence was such that even those who knew they should resist it succumbed.
In the end, to stiffen the buildings, he devised, and installed in each of them, eleven thousand two-and-a-half-foot-long metal shock absorbers. The extra steel likely enabled the buildings to stand for as long as they did after they were struck by commercial airliners, and it allowed some of the fourteen thousand people who escaped to flee before the buildings collapsed.
The researchers then repeated the experiment with clinical psychologists and psychiatrists, who gave them the list of factors they considered when deciding whether it was safe to release a patient from a psychiatric hospital. Once again, the experts were all over the map. Even more bizarrely, those with the least training (graduate students) were just as accurate as the fully trained ones (paid pros) in their predictions about what any given psychiatric patient would get up to if you let him out the door. Experience appeared to be of little value in judging, say, whether a person was at risk
...more
“The clinician is not a machine,” he wrote. “While he possesses his full share of human learning and hypothesis-generating skills, he lacks the machine’s reliability. He ‘has his days’: Boredom, fatigue, illness, situational and interpersonal distractions all plague him, with the result that his repeated judgments of the exact same stimulus configuration are not identical. . . . If we could remove some of this human unreliability by eliminating this random error in his judgments, we should thereby increase the validity of the resulting predictions . . .”
The way the creative process works is that you first say something, and later, sometimes years later, you understand what you said.
“They wrote together sitting right next to each other at the typewriter,” recalls Michigan psychologist Richard Nisbett. “I cannot imagine. It would be like having someone else brush my teeth for me.” The way Danny put it was, “We were sharing a mind.”
And when people calculate the odds in any life situation, they are often making judgments about similarity—or (strange new word!) representativeness. You have some notion of a parent population: “storm clouds” or “gastric ulcers” or “genocidal dictators” or “NBA players.” You compare the specific case to the parent population.
What is more, if you asked the same Israeli kids to choose the more likely birth order in families with six children—B B B G G G or G B B G B G—they overwhelmingly opted for the latter. But the two birth orders are equally likely. So why did people almost universally believe that one was far more likely than the other? Because, said Danny and Amos, people thought of birth order as a random process, and the second sequence looks more “random” than the first.
We have a kind of stereotype of “randomness” that differs from true randomness. Our stereotype of randomness lacks the clusters and patterns that occur in true random sequences.
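Two quick checks in Python of the claims in the last two highlights, assuming boys and girls are equally likely (the has_long_run helper and the streak length of four are illustrative assumptions): any specific six-child order has probability (1/2)^6, and genuinely random sequences contain more streaks than our stereotype of randomness allows.

import random

# Any exact six-birth order, B B B G G G or G B B G B G alike, has the same probability.
print(0.5 ** 6)  # 0.015625, i.e. 1 in 64 for either sequence

random.seed(1)

def has_long_run(seq, length=4):
    # True if the sequence contains a streak of `length` or more identical items.
    run = 1
    for a, b in zip(seq, seq[1:]):
        run = run + 1 if a == b else 1
        if run >= length:
            return True
    return False

trials = 100_000
hits = sum(has_long_run([random.choice("BG") for _ in range(6)]) for _ in range(trials))
print(hits / trials)  # roughly a quarter of random six-child families include a streak of four or more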
The more easily people can call some scenario to mind—the more available it is to them—the more probable they find it to be.
Any fact or incident that was especially vivid, or recent, or common—or anything that happened to preoccupy a person—was likely to be recalled with special ease, and so be disproportionately weighted in any judgment.
The unsuspecting Oregon students, having listened to a list, were then asked to judge if it contained the names of more men or more women. They almost always got it backward: If the list had more male names on it, but the women’s names were famous, they thought the list contained more female names, and vice versa.
Nevertheless, the availability heuristic may be applied to evaluate the likelihood of such events. “In judging the likelihood that a particular couple will be divorced, for example, one may scan one’s memory for similar couples which this question brings to mind. Divorces will appear probable if divorces are prevalent among the instances that are retrieved in this manner.”
“Consequently,” Amos and Danny wrote, “the use of the availability heuristic leads to systematic biases.” Human judgment was distorted by . . . the memorable.
Danny and Amos asked their subjects to spin a wheel of fortune with slots on it that were numbered 0 through 100. Then they asked the subjects to estimate the percentage of African countries in the United Nations. The people who spun a higher number on the wheel tended to guess that a higher percentage of the United Nations consisted of African countries than did those for whom the needle landed on a lower number. What was going on here? Was anchoring a heuristic, the way that representativeness and availability were heuristics? Was it a shortcut that people used, in effect, to answer to their
...more
The stories we make up, rooted in our memories, effectively replace probability judgments.
The stories people told themselves, when the odds were either unknown or unknowable, were naturally too simple.
It’s far easier for a Jew living in Paris in 1939 to construct a story about how the German army will behave much as it had in 1919, for instance, than to invent a story in which it behaves as it did in 1941, no matter how persuasive the evidence might be that, this time, things are different.