Kindle Notes & Highlights
However, a lazy System 2 often follows the path of least effort and endorses a heuristic answer without much scrutiny of whether it is truly appropriate.
The bias associated with the heuristic is that objects that appear to be more distant also appear to be larger on the page.
Any emotionally significant question that alters a person’s mood will have the same effect. WYSIATI. The present state of mind looms very large when people evaluate their happiness.
the affect heuristic, in which people let their likes and dislikes determine their beliefs about the world.
Your beliefs, and even your emotional attitude, may change (at least a little) when you learn that the risk of an activity you disliked is smaller than you thought. However, the information about lower risks will also change your view of the benefits (for the better) even if nothing was said about benefits in the information you received.
System 1 is highly adept in one form of thinking—it automatically and effortlessly identifies causal connections between events, sometimes even when the connection is spurious.
System 1 is inept when faced with “merely statistical” facts, which change the probability of outcomes but do not cause them to happen.
The explanation I offered is statistical: extreme outcomes (both high and low) are more likely to be found in small than in large samples.
Large samples are more precise than small samples. Small samples yield extreme results more often than large samples do.
Using a sufficiently large sample is the only way to reduce the risk.
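The claim that small samples yield extreme results more often than large ones can be checked with a quick simulation. This sketch is an illustration, not from the book; the function name and the 70% threshold for "extreme" are arbitrary choices:

```python
import random

def extreme_rate(sample_size, trials=10_000, threshold=0.7):
    """Simulate flipping a fair coin `sample_size` times, `trials` times over.
    Return the fraction of samples whose observed proportion of heads is
    extreme: at least `threshold` or at most 1 - `threshold`."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        proportion = heads / sample_size
        if proportion >= threshold or proportion <= 1 - threshold:
            extreme += 1
    return extreme / trials

# Even though both samples come from the same fair coin, the small
# sample produces "extreme" proportions far more often than the large one.
small = extreme_rate(sample_size=10)
large = extreme_rate(sample_size=100)
print(small, large)
```

For a fair coin, getting 7 or more heads (or 7 or more tails) out of 10 happens in roughly a third of samples, while 70 or more out of 100 is vanishingly rare — the same underlying process, but the small sample looks far more "extreme" far more often.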
“intuitions about random sampling appear to satisfy the law of small numbers, which asserts that the law of large numbers applies to small numbers as well.”
“statistical intuitions with proper suspicion and replace impression formation by computation whenever possible.”
The law of small numbers is a manifestation of a general bias that favors certainty over doubt, which will turn up in many guises in following chapters.
we are prone to exaggerate the consistency and coherence of what we see.
Instead of focusing on how the event at hand came to be, the statistical view relates it to what could have happened instead. Nothing in particular caused it to be what it is—chance selected it from among its alternatives.
predilection
“To the untrained eye,” Feller remarks, “randomness appears as regularity or tendency to cluster.”
The exaggerated faith in small samples is only one example of a more general illusion—we pay more attention to the content of messages than to information about their reliability,
Statistics produce many observations that appear to beg for causal explanations but do not lend themselves to such explanations. Many facts of the world are due to chance, including accidents of sampling.
it is an anchoring effect. It occurs when people consider a particular value for an unknown quantity before estimating that quantity.
people’s judgments were influenced by an obviously uninformative number.
Two different mechanisms produce anchoring effects—one for each system. There is a form of anchoring that occurs in a deliberate process of adjustment, an operation of System 2. And there is anchoring that occurs by a priming effect, an automatic manifestation of System 1.
Amos liked the idea of an adjust-and-anchor heuristic as a strategy for estimating uncertain quantities: start from an anchoring number, assess whether it is too high or too low, and gradually adjust your estimate by mentally “moving” from the anchor.
Insufficient adjustment neatly explains why you are likely to drive too fast when you come off the highway onto city streets—especially if you are talking with someone as you drive. Insufficient adjustment is also a source of tension between exasperated parents and teenagers who enjoy loud music in their room.
System 1 understands sentences by trying to make them true, and the selective activation of compatible thoughts produces a family of systematic errors that make us gullible and prone to believe too strongly whatever we believe.
Powerful anchoring effects are found in decisions that people make about money, such as when they choose how much to contribute to a cause.
Anchoring effects explain why, for example, arbitrary rationing is an effective marketing ploy.
As you may have experienced when negotiating for the first time in a bazaar, the initial anchor has a powerful effect. My advice to students when I taught negotiations was that if you think the other side has made an outrageous proposal, you should not come back with an equally outrageous counteroffer, creating a gap that will be difficult to bridge in further negotiations. Instead you should make a scene, storm out or threaten to do so, and make it clear—to yourself as well as to the other side—that you will not continue the negotiation with that number on the table.
The psychologists Adam Galinsky and Thomas Mussweiler proposed more subtle ways to resist the anchoring effect in negotiations. They instructed negotiators to focus their attention and search their memory for arguments against the anchor.
We saw in the discussion of the law of small numbers that a message, unless it is immediately rejected as a lie, will have the same effect on the associative system regardless of its reliability.
The gist of the message is the story, which is based on whatever information is available, even if the quantity of the information is slight and its quality is poor: WYSIATI.
Anchoring effects are threatening in a similar way. You are always aware of the anchor and even pay attention to it, but you do not know how it guides and constrains your thinking, because you cannot imagine how you would have thought if the anchor had been different (or absent). However, you should assume that any number that is on the table has had an anchoring effect on you, and if the stakes are high you should mobilize yourself (your System 2) to combat the effect.
We defined the availability heuristic as the process of judging frequency by “the ease with which instances come to mind.”
We now know that both systems are involved.
The availability heuristic, like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind.
Resisting this large collection of potential availability biases is possible, but tiresome. You must make the effort to reconsider your impressions and intuitions by asking such questions as, “Is our belief that thefts by teenagers are a major problem due to a few recent instances in our neighborhood?” or “Could it be that I feel no need to get a flu shot because none of my acquaintances got the flu last year?” Maintaining one’s vigilance against biases is a chore—but the chance to avoid a costly mistake is sometimes worth the effort.
You will occasionally do more than your share, but it is useful to know that you are likely to have that feeling even when each member of the team feels the same way.
Schwarz and his colleagues took on this challenge of discovering the conditions under which this reversal would take place.
judgments are no longer influenced by ease of retrieval when the experience of fluency is given a spurious explanation by the presence of curved or straight text boxes…
the process that leads to judgment by availability appears to involve a complex chain of reasoning. The subjects have an experience of diminishing fluency as they produce instances. They evidently have expectations about the rate at which fluency decreases, and those expectations are wrong: the difficulty of coming up with new instances increases more rapidly than they expect. It is the unexpectedly low fluency that causes people who were asked for twelve instances to describe themselves as unassertive.
When the surprise is eliminated, low fluency no longer influences the judgment. The process appears to consist of a sophisticated set of inferences. Is the automatic System 1 capable of it?
The answer is that in fact no complex reasoning is needed. Among the basic features of System 1 is its ability to set expectations and to be surprised when these expectations are violated. The system also retrieves possible causes of a surprise…
Furthermore, System 2 can reset the expectations of System 1 on the fly, so that an event that would normally be ...
The conclusion is that the ease with which instances come to mind is a System 1 heuristic, which is replaced by a focus on content when System 2 is more engaged. Multiple lines of evidence converge on the conclusion that people who let themselves be guided by System 1 are more strongly susceptible to availability biases than others who are in a state of higher vigilance. The following are some conditions in which people “go with the flow” and are affected more strongly by ease of retrieval than by the content they retrieved: when they are engaged in another effortful task at the same time…
Kunreuther also observed that protective actions, whether by individuals or governments, are usually designed to be adequate to the worst disaster actually experienced.
The lesson is clear: estimates of causes of death are warped by media coverage. The coverage is itself biased toward novelty and poignancy. The media do not just shape what the public is interested in, but also are shaped by it.
they saw that the ease with which ideas of various risks come to mind and the emotional reactions to these…
Slovic eventually developed the notion of an affect heuristic, in which people make judgments and decisions by consulting their emotions:
The affect heuristic is an instance of substitution, in which the answer to an easy question (How do I feel about it?) serves as an answer to a much harder question (What do I think about it?).
people’s emotional evaluations of outcomes, and the bodily states and the approach and avoidance tendencies associated with them, all play a central role in guiding decision making. Damasio and his colleagues have observed that people who do not display the appropriate emotions before they decide, sometimes because of brain damage, also have an impaired ability to make good decisions.