Kindle Notes & Highlights
we judged the size of categories by the ease with which instances came to mind. We called this reliance on the ease of memory search the availability heuristic.
Social scientists in the 1970s broadly accepted two ideas about human nature. First, people are generally rational, and their thinking is normally sound. Second, emotions such as fear, affection, and hatred explain most of the occasions on which people depart from rationality.
“The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”
affect heuristic, where judgments and decisions are guided directly by feelings of liking and disliking, with little deliberation or reasoning.
This is the essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.
System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency
The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.
most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word.
continuous vigilance is not necessarily good, and it is certainly impractical. Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.
the pupil was a good measure of the physical arousal that accompanies mental effort.
System 2 is the only one that can follow rules, compare objects on several attributes, and make deliberate choices between options.
People who experience flow describe it as “a state of effortless concentration so deep that they lose their sense of time, of themselves, of their problems.”
many people are overconfident, prone to place too much faith in their intuitions. They apparently find cognitive effort at least mildly unpleasant and avoid it as much as possible.
when people believe a conclusion is true, they are also very likely to believe arguments that appear to support it, even when these arguments are unsound.
Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed.
System 1 is impulsive and intuitive; System 2 is capable of reasoning, and it is cautious, but at least for some people it is also lazy.
associative activation: ideas that have been evoked trigger many other ideas, in a spreading cascade of activity in your brain.
The notion that we have limited access to the workings of our minds is difficult to accept because, naturally, it is alien to our experience, but it is true: you know far less about yourself than you feel you do.
This remarkable priming phenomenon—the influencing of an action by the idea—is known as the ideomotor effect.
the idea of money primes individualism: a reluctance to be involved with others, to depend on others, or to accept demands from others.
“The world makes much less sense than you think. The coherence comes mostly from the way your mind works.”
A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.
creativity is associative memory that works exceptionally well.
Mood evidently affects the operation of System 1: when we are uncomfortable and unhappy, we lose touch with our intuition.
Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake acceptable, and if the jump saves much time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information.
when System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe; System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy.
The tendency to like (or dislike) everything about a person—including things you have not observed—is known as the halo effect.
The procedure I adopted to tame the halo effect conforms to a general principle: decorrelate error!
It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.
Because System 1 represents categories by a prototype or a set of typical exemplars, it deals well with averages but poorly with sums. The size of the category, the number of instances it contains, tends to be ignored in judgments of what I will call sum-like variables.
We often compute much more than we want or need. I call this excess computation the mental shotgun.
If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. I call the operation of answering one question in place of another substitution.
The psychologist Paul Slovic has proposed an affect heuristic in which people let their likes and dislikes determine their beliefs about the world.
System 1 is inept when faced with “merely statistical” facts, which change the probability of outcomes but do not cause them to happen.
If this is the case, the differences between dense and rural counties do not really count as facts: they are what scientists call artifacts, observations that are produced entirely by some aspect of the method of research—in this case, by differences in sample size.
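The sample-size artifact described here can be reproduced directly. A minimal simulation sketch (not from the book; the population sizes and disease rate are illustrative): give every county the same true incidence rate, then compare observed rates in small versus large counties. The extreme rates, high and low alike, come from the small counties.

```python
import random

random.seed(0)
TRUE_RATE = 0.01  # same underlying incidence rate in every county

def observed_rates(population, n_counties=1000):
    """Simulate per-county observed incidence for counties of a given size."""
    rates = []
    for _ in range(n_counties):
        cases = sum(random.random() < TRUE_RATE for _ in range(population))
        rates.append(cases / population)
    return rates

small = observed_rates(population=100)   # sparse rural counties
large = observed_rates(population=2000)  # dense urban counties

# Both the highest and the lowest observed rates belong to the small
# counties, even though every county shares the same true rate.
print("small counties:", min(small), "to", max(small))
print("large counties:", min(large), "to", max(large))
```

Nothing about rural life drives the pattern; small samples simply produce more extreme proportions, which is the "law of small numbers" named in the next highlight.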
We explained, tongue-in-cheek, that “intuitions about random sampling appear to satisfy the law of small numbers, which asserts that the law of large numbers applies to small numbers as well.”
We do not expect to see regularity produced by a random process, and when we detect what appears to be a rule, we quickly reject the idea that the process is truly random.
you will more often than not err by misclassifying a random event as systematic. We are far too willing to reject the belief that much of what we see in life is random.
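The misperception of randomness is easy to demonstrate. A sketch (illustrative, not from the book): count the longest streak of identical outcomes in sequences of 100 fair coin flips. Streaks of five or more, which intuition reads as systematic, are in fact the norm in purely random data.

```python
import random

random.seed(1)

def longest_run(flips):
    """Length of the longest streak of identical consecutive outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

# Fraction of random 100-flip sequences containing a streak of 5 or more.
trials = [longest_run([random.choice("HT") for _ in range(100)]) for _ in range(1000)]
share_with_long_streak = sum(r >= 5 for r in trials) / len(trials)
print(f"{share_with_long_streak:.0%} of random sequences contain a streak of 5+")
```

A long streak is weak evidence of a rule; the absence of any streak would be the real anomaly.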
anchoring effect. It occurs when people consider a particular value for an unknown quantity before estimating that quantity.
My hunch was that anchoring is a case of suggestion. This is the word we use when someone causes us to see, hear, or feel something by merely bringing it to mind.
We defined the availability heuristic as the process of judging frequency by “the ease with which instances come to mind.”
The conclusion is that the ease with which instances come to mind is a System 1 heuristic, which is replaced by a focus on content when System 2 is more engaged. Multiple lines of evidence converge on the conclusion that people who let themselves be guided by System 1 are more strongly susceptible to availability biases than others who are in a state of higher vigilance.
basic limitation in the ability of our mind to deal with small risks: we either ignore them altogether or give them far too much weight—nothing in between.
the amount of concern is not adequately sensitive to the probability of harm; you are imagining the numerator—the tragic story you saw on the news—and not thinking about the denominator. Sunstein has coined the phrase “probability neglect” to describe the pattern.
The essential keys to disciplined Bayesian reasoning can be simply summarized: Anchor your judgment of the probability of an outcome on a plausible base rate. Question the diagnosticity of your evidence.
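The two keys named here, a plausible base rate and the diagnosticity of the evidence, map directly onto Bayes' rule. A minimal sketch (the numbers are illustrative, not from the text): even evidence that fits the hypothesis four times better than the alternative cannot overcome a low base rate.

```python
def bayes_update(base_rate, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of the outcome given the evidence (Bayes' rule)."""
    numerator = base_rate * p_evidence_if_true
    denominator = numerator + (1 - base_rate) * p_evidence_if_false
    return numerator / denominator

# Illustrative numbers: a description fits the hypothesis well (80%) but also
# fits the alternative fairly often (20%), while the base rate is only 5%.
posterior = bayes_update(base_rate=0.05, p_evidence_if_true=0.80, p_evidence_if_false=0.20)
print(round(posterior, 3))  # roughly 0.174 — still far below 50%
```

Diagnosticity is the ratio of the two evidence probabilities; when that ratio is 1, the evidence is uninformative and the posterior equals the base rate.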
This is a trap for forecasters and their clients: adding detail to scenarios makes them more persuasive, but less likely to come true.
The social norm against stereotyping, including the opposition to profiling, has been highly beneficial in creating a more civilized and more equal society. It is useful to remember, however, that neglecting valid stereotypes inevitably results in suboptimal judgments. Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is costless is wrong.
Statistical results with a causal interpretation have a stronger effect on our thinking than noncausal information. But even compelling causal statistics will not change long-held beliefs or beliefs rooted in personal experience. On the other hand, surprising individual cases have a powerful impact and are a more effective tool for teaching psychology because the incongruity must be resolved and embedded in a causal story.
whenever the correlation between two scores is imperfect, there will be regression to the mean.
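This can be checked with a toy model (a sketch, not from the book): let each of two test scores be a shared "skill" component plus independent "luck," so the scores are imperfectly correlated. Selecting people by an extreme first score then guarantees a less extreme average second score.

```python
import random

random.seed(2)

# Each score = shared skill + independent luck, so the two scores are
# imperfectly correlated (correlation 0.5 under this model).
def two_scores():
    skill = random.gauss(0, 1)
    return skill + random.gauss(0, 1), skill + random.gauss(0, 1)

pairs = [two_scores() for _ in range(50_000)]

# Among those who scored very well the first time, the average second
# score is pulled back toward the overall mean of zero.
top = [s2 for s1, s2 in pairs if s1 > 2.0]
mean_second = sum(top) / len(top)
print(f"first score > 2.0, average second score = {mean_second:.2f}")
```

No causal story about pressure or complacency is needed; regression follows from the imperfect correlation alone.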