Freeman “Dyson Sphere” Dyson wrote the New York Review of Books review, which has me swooning right there. He was a particularly apt pick because Kahneman helped design the Israeli military screening and training systems back when the country was young, and Dyson, at 20 years old, cranked statistics for British Bomber Command in its youth. He was part of a small group that figured out the bombers were wrong about what mattered to surviving nighttime raids over Germany, something only about a quarter of the crews managed over a tour. But no data-driven changes were made because “the illusion of validity does not disappear just because facts prove it to be false. Everyone at Bomber Command, from the commander in chief to the flying crews, continued to believe in the illusion. The crews continued to die, experienced and inexperienced alike, until Germany was overrun and the war finally ended.” http://www.nybooks.com/articles/archi...
Why did the British military resist the changes? Because it was deeply inconsistent with the heroic story of the RAF they believed in. I suppose there are stories I’d die for too. Kahneman got the Nobel Prize in Economics for showing that the story that humans can usefully be abstracted as the Rational Man of Economics was based on a fundamental misunderstanding of decision making. We are not evolved to be rational wealth maximizers, and we systematically value and fear some things that should not be valued so highly or feared so much if we really were the Homo Economicus the Austrian School seems to think we should be. Which is personally deeply satisfying, because I never bought it, and deeply unsettling, because of how many decisions are made based on that vision.
If that was all this book was, it’d just be another in a mass of books whose thesis is “You’re wrong about that!” Which I appreciate knowing, but there’s a point where it’s a little eye-rolling, because they don’t offer any helpful suggestions on how not to be wrong, or on why these patterns of wrongness exist and endure. But Kahneman has a theory. He theorizes that humans have two largely separate decision-making systems: System 1 (the fast) and System 2 (the slow). System 1 lets us survive monster attacks and have meaningful relationships with each other. System 2 lets us get to the moon.
Both systems have values built into them and any system of decision-making that edits them out is doomed to undercut itself. Some specifics that struck me:
Ideomotor Effect (53): Concepts live in our heads in associative networks. Once one is triggered, related concepts cascade. Make someone walk slowly, and they think about old age. Make someone smile, and they’ll be happier. Seeing a picture of cash makes us more independent, more selfish, and less likely to pick up something someone else has dropped. Seeing a locker makes us more likely to vote for school bonds. Reminding people of their mortality makes them more receptive to authoritarian ideas. (56) “Studies of priming effects have yielded discoveries that threaten our self-image as conscious and autonomous authors of our judgments and our choices.” (55)
Halo Effect (82) “If you like the president’s politics, you probably like his voice and appearance as well.” We find someone attractive and we conclude they’re competent. We find emotional coherence pleasing and lack of coherence frustrating. However, far fewer things are correlated than we believe.
What You See Is All There Is (WYSIATI) (85). Our System 1 is pattern-seeking. Our System 2 is lazy, happy to endorse System 1 beliefs without doing the math. “Jumping to conclusions on the basis of limited evidence is so important to an understanding of intuitive thinking, and comes up so often in this book, that I will use a cumbersome abbreviation for it: WYSIATI. . . . System 1 is radically insensitive to both the quality and quantity of information that gives rise to impressions and intuitions.” (86). Absolutely essential for not getting eaten by lurking monsters, and it “explains why we can think fast, and how we are able to make sense of partial information in a complex world. Much of the time, the coherent story we put together is close enough to reality to support reasonable action.” Except when it doesn’t. Like in our comparative risk assessments: we panic about shark attacks and fail to fear riptides; we freak out about novel and unusual risks and opportunities and undervalue the pervasive.
Answering an Easier Question (97). If one question is hard, we’ll substitute an easier one. It can be a good way to make decisions. Unless the easier question is not a good substitute. I have an uneasy awareness that I do this. Especially since it often REALLY ANNOYS me when people do it to me.
The Law of Small Numbers (109). The counties with the lowest level of kidney cancer are rural, sparsely populated, and located in traditionally Republican states. Why? Good clean living? The counties with the highest level of kidney cancer are rural, sparsely populated, and located in traditionally Republican states. Why? Lack of access to health care? Wait, what? The System 1 mind immediately comes up with a story to explain the difference. But once the numbers are cranked, apparently, it’s just an artifact of the fact that a few cases in a small county skew the rate. But if you base your decision on either story, the outcomes will be bad.
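You can watch the small-numbers artifact happen in a quick simulation (the county populations and the cancer rate below are invented numbers, not Kahneman’s data): give every county exactly the same true rate, and the smallest counties still land at both extremes.

```python
import random

random.seed(0)
TRUE_RATE = 1e-4  # identical true kidney-cancer rate everywhere (made-up number)

# Hypothetical mix of counties: 300 small rural ones, 20 large urban ones.
pops = [random.randint(1_000, 5_000) for _ in range(300)]
pops += [random.randint(50_000, 200_000) for _ in range(20)]

def observed_rate(pop):
    # Each resident independently "gets cancer" with probability TRUE_RATE.
    cases = sum(random.random() < TRUE_RATE for _ in range(pop))
    return cases / pop

rates = [(observed_rate(pop), pop) for pop in pops]
by_rate = sorted(rates)  # sort counties by their observed rate

print("populations of the 10 lowest-rate counties: ", [p for _, p in by_rate[:10]])
print("populations of the 10 highest-rate counties:", [p for _, p in by_rate[-10:]])
# Both extremes are dominated by small counties, even though every county
# shares the same true rate: a handful of cases swings a tiny denominator.
```

A large county’s observed rate hugs the true rate; a 2,000-person county’s rate is either 0 or a multiple of 1/2,000, so it can only be extreme.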
Anchors (119). We seize on the first value offered, no matter how obviously absurd it is. If you want to push someone in a direction, get them to accept the anchor you want them to.
Regression to the Mean (175). There will be random fluctuations in the quality of performance. A teacher who praises a randomly good performance may shape behavior, but will more likely simply be disappointed when statistics asserts itself and a bad performance follows. A teacher who criticizes a bad performance may incentivize improvement, but will more likely simply have a false sense of causation when statistics asserts itself and a good performance happens. Kahneman describes it as “a significant fact of the human condition: the feedback to which life exposes us is perverse. Because we tend to be nice to other people when they please us and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty.” (176)
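A tiny simulation makes the point (all the numbers are invented): model each performance as fixed skill plus random noise, with no feedback effect whatsoever, and follow-ups to bad performances still “improve” most of the time.

```python
import random

random.seed(1)

def performance(skill=0.0, noise=1.0):
    # Hypothetical model: performance = constant skill + random noise.
    # Nothing a teacher says enters this model at all.
    return skill + random.gauss(0, noise)

# 10,000 independent pairs of (first performance, second performance).
pairs = [(performance(), performance()) for _ in range(10_000)]

# Among trials whose first performance was unusually bad (below -1),
# how often did the second one come out better?
bad_first = [(a, b) for a, b in pairs if a < -1]
improved = sum(b > a for a, b in bad_first) / len(bad_first)
print(f"after a bad performance, {improved:.0%} of follow-ups were better")
```

A scolding teacher sees that ~90% “improvement” and credits the scolding; the noise would have regressed toward the mean either way.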
The Illusion of Understanding (204). The sense-making machinery of System 1 makes us see the world as more tidy, simple, predictable, and coherent than it really is. “The illusion that one has understood the past feeds the further illusion that one can control the future. These illusions are comforting. They reduce the anxiety that we would experience if we allowed ourselves to fully acknowledge the uncertainties of existence. We all have a need for the reassuring message that actions have appropriate consequences, and that success will reward wisdom and courage.” But it doesn’t. (212) For example, we’re totally wrong about whether you can beat the stock market.
Formulas are often much more predictive than learned intuition. I’m going to have to wrestle with this one, but he alluded to a claim by Robyn Dawes that “marital stability is well predicted by a formula: frequency of lovemaking minus frequency of quarrels.” (226) Snicker.
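Dawes’s point, as I understand it, was that “improper” equal-weight formulas like that one often beat expert clinical intuition. A toy illustration (the couples and their counts are invented):

```python
# Hypothetical couples: (lovemaking per week, quarrels per week). Invented data.
couples = {"A": (3, 1), "B": (1, 4), "C": (2, 2)}

# Dawes's improper linear model: equal weights, just take the difference.
scores = {name: love - quarrels for name, (love, quarrels) in couples.items()}

for name, score in scores.items():
    verdict = "likely stable" if score > 0 else "at risk"
    print(f"couple {name}: score {score:+d} -> {verdict}")
```

The striking claim isn’t that this particular formula is deep; it’s that two crude counts with equal weights can outpredict a trained interviewer.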
Premortems Can Help. (264) Before making a decision, assign someone to imagine it’s a year into the future and the plan has turned out to be a disaster. Have them write a history of the disaster.
We value losses more than gains. (349) Which is fine except when that means we expose others to more risk because we did the math wrong.
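The usual way to picture this asymmetry is the prospect-theory value function. A sketch using the parameters Tversky and Kahneman estimated in 1992 (alpha ≈ 0.88 for diminishing sensitivity, lambda ≈ 2.25 for loss aversion; treat the numbers as illustrative):

```python
# Prospect-theory value function, Tversky & Kahneman's 1992 estimates:
# alpha ~ 0.88 (curvature), lambda ~ 2.25 (loss aversion). Illustrative only.
ALPHA, LAM = 0.88, 2.25

def value(x):
    """Subjective value of a gain (x >= 0) or loss (x < 0) of size x."""
    return x ** ALPHA if x >= 0 else -LAM * (-x) ** ALPHA

print("value of winning $100:", round(value(100), 1))
print("value of losing  $100:", round(value(-100), 1))
# The loss looms about 2.25x larger than the equivalent gain.
```

That lambda is the “we value losses more than gains” part in one number: a coin flip that pays $100 or costs $100 feels like a bad bet.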
The Focusing Illusion (402) “Nothing in life is as important as you think it is when you are thinking about it.” We overvalue what’s in our mind at the moment, which is subject to priming.
He closes by stressing he does not mean to say that people are irrational. But, he says, “rational” in economic terms has a particular meaning that does not describe people. “For economists and decision theorists, [rationality] has an altogether different meaning. The only test of rationality is not whether a person’s beliefs and preferences are reasonable, but whether they are internally consistent. A rational person can believe in ghosts, so long as all her other beliefs are consistent with the existence of ghosts. . . . Rationality is logical coherence – reasonable or not. Econs are rational by this definition, but there is overwhelming evidence that Humans cannot be. . . .
“The definition of rationality as coherence is impossibly restrictive; it demands adherence to rules of logic that a finite mind is not able to implement. Reasonable people cannot be rational by that definition, but they should not be branded as irrational for that reason. Irrational is a strong word, which connotes impulsivity, emotionality, and a stubborn resistance to reasoned argument. I often cringe when my work with Amos is credited with demonstrating that human choices are irrational, when in fact our research only showed that Humans are not well described by the rational-agent model.” (411)
A good read.