Thinking, Fast and Slow
Started reading January 20, 2018
2%
we judged the size of categories by the ease with which instances came to mind. We called this reliance on the ease of memory search the availability heuristic.
2%
Social scientists in the 1970s broadly accepted two ideas about human nature. First, people are generally rational, and their thinking is normally sound. Second, emotions such as fear, affection, and hatred explain most of the occasions on which people depart from rationality.
3%
“The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”
3%
affect heuristic, where judgments and decisions are guided directly by feelings of liking and disliking, with little deliberation or reasoning.
3%
This is the essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.
4%
System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency
4%
The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.
5%
most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word.
5%
continuous vigilance is not necessarily good, and it is certainly impractical. Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.
6%
the pupil was a good measure of the physical arousal that accompanies mental effort,
7%
System 2 is the only one that can follow rules, compare objects on several attributes, and make deliberate choices between options.
7%
People who experience flow describe it as “a state of effortless concentration so deep that they lose their sense of time, of themselves, of their problems,”
8%
many people are overconfident, prone to place too much faith in their intuitions. They apparently find cognitive effort at least mildly unpleasant and avoid it as much as possible.
8%
when people believe a conclusion is true, they are also very likely to believe arguments that appear to support it, even when these arguments are unsound.
9%
Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed.
9%
System 1 is impulsive and intuitive; System 2 is capable of reasoning, and it is cautious, but at least for some people it is also lazy.
10%
associative activation: ideas that have been evoked trigger many other ideas, in a spreading cascade of activity in your brain.
10%
The notion that we have limited access to the workings of our minds is difficult to accept because, naturally, it is alien to our experience, but it is true: you know far less about yourself than you feel you do.
10%
This remarkable priming phenomenon—the influencing of an action by the idea—is known as the ideomotor effect.
11%
the idea of money primes individualism: a reluctance to be involved with others, to depend on others, or to accept demands from others.
11%
“The world makes much less sense than you think. The coherence comes mostly from the way your mind works.”
12%
A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.
13%
creativity is associative memory that works exceptionally well.
13%
Mood evidently affects the operation of System 1: when we are uncomfortable and unhappy, we lose touch with our intuition.
15%
Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake acceptable, and if the jump saves much time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information.
15%
when System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy.
16%
The tendency to like (or dislike) everything about a person—including things you have not observed—is known as the halo effect.
16%
The procedure I adopted to tame the halo effect conforms to a general principle: decorrelate error!
17%
It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.
18%
Because System 1 represents categories by a prototype or a set of typical exemplars, it deals well with averages but poorly with sums. The size of the category, the number of instances it contains, tends to be ignored in judgments of what I will call sum-like variables.
18%
We often compute much more than we want or need. I call this excess computation the mental shotgun.
19%
If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. I call the operation of answering one question in place of another substitution.
20%
The psychologist Paul Slovic has proposed an affect heuristic in which people let their likes and dislikes determine their beliefs about the world.
20%
System 1 is inept when faced with “merely statistical” facts, which change the probability of outcomes but do not cause them to happen.
21%
If this is the case, the differences between dense and rural counties do not really count as facts: they are what scientists call artifacts, observations that are produced entirely by some aspect of the method of research—in this case, by differences in sample size.
21%
We explained, tongue-in-cheek, that “intuitions about random sampling appear to satisfy the law of small numbers, which asserts that the law of large numbers applies to small numbers as well.”
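A quick way to see why small samples masquerade as facts: the sketch below (my own illustration with invented numbers, not from the book) gives every county the identical true incidence rate and varies only population size. The smallest counties end up with both the highest and the lowest observed rates, purely from sampling noise.

```python
import numpy as np

rng = np.random.default_rng(0)

TRUE_RATE = 0.001                  # identical underlying rate everywhere (illustrative)
SMALL, LARGE = 1_000, 100_000      # "rural" vs "dense" county populations
N_COUNTIES = 500

# Observed incidence per county: cases / population, cases ~ Binomial(population, rate)
small = rng.binomial(SMALL, TRUE_RATE, N_COUNTIES) / SMALL
large = rng.binomial(LARGE, TRUE_RATE, N_COUNTIES) / LARGE

# Small counties occupy both extremes even though nothing real differs between them.
print(f"small counties: min {small.min():.4f}  max {small.max():.4f}")
print(f"large counties: min {large.min():.4f}  max {large.max():.4f}")
```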
21%
We do not expect to see regularity produced by a random process, and when we detect what appears to be a rule, we quickly reject the idea that the process is truly random.
22%
you will more often than not err by misclassifying a random event as systematic. We are far too willing to reject the belief that much of what we see in life is random.
22%
anchoring effect. It occurs when people consider a particular value for an unknown quantity before estimating that quantity.
23%
My hunch was that anchoring is a case of suggestion. This is the word we use when someone causes us to see, hear, or feel something by merely bringing it to mind.
24%
We defined the availability heuristic as the process of judging frequency by “the ease with which instances come to mind.”
25%
The conclusion is that the ease with which instances come to mind is a System 1 heuristic, which is replaced by a focus on content when System 2 is more engaged. Multiple lines of evidence converge on the conclusion that people who let themselves be guided by System 1 are more strongly susceptible to availability biases than others who are in a state of higher vigilance.
27%
basic limitation in the ability of our mind to deal with small risks: we either ignore them altogether or give them far too much weight—nothing in between.
27%
the amount of concern is not adequately sensitive to the probability of harm; you are imagining the numerator—the tragic story you saw on the news—and not thinking about the denominator. Sunstein has coined the phrase “probability neglect” to describe the pattern.
29%
The essential keys to disciplined Bayesian reasoning can be simply summarized: Anchor your judgment of the probability of an outcome on a plausible base rate. Question the diagnosticity of your evidence.
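Those two keys translate directly into Bayes' rule: the base rate sets the prior odds, and the diagnosticity of the evidence is its likelihood ratio. A minimal sketch with invented numbers (not from the book):

```python
def posterior(base_rate, p_evidence_if_true, p_evidence_if_false):
    """Combine a base-rate prior with the likelihood ratio of the evidence."""
    prior_odds = base_rate / (1 - base_rate)
    likelihood_ratio = p_evidence_if_true / p_evidence_if_false  # diagnosticity
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Mildly diagnostic evidence (likelihood ratio 2) against a 3% base rate
print(round(posterior(0.03, 0.6, 0.3), 3))   # 0.058 -- stays close to the base rate
# The same evidence against a 50% base rate moves the judgment much further
print(round(posterior(0.50, 0.6, 0.3), 3))   # 0.667
```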
30%
This is a trap for forecasters and their clients: adding detail to scenarios makes them more persuasive, but less likely to come true.
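The arithmetic behind that trap is the conjunction rule: each added detail is one more condition that must also hold, so the joint probability can only go down even as the story grows more vivid. A toy calculation with made-up probabilities (independence assumed only to keep the arithmetic plain):

```python
p_core_event = 0.30   # the bare forecast
p_detail_1   = 0.50   # a plausible mechanism added to the scenario
p_detail_2   = 0.40   # a further specific trigger

print(p_core_event)                              # 0.30
print(p_core_event * p_detail_1 * p_detail_2)    # 0.06 -- richer scenario, lower probability
```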
32%
The social norm against stereotyping, including the opposition to profiling, has been highly beneficial in creating a more civilized and more equal society. It is useful to remember, however, that neglecting valid stereotypes inevitably results in suboptimal judgments. Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is costless is wrong.
33%
Statistical results with a causal interpretation have a stronger effect on our thinking than noncausal information. But even compelling causal statistics will not change long-held beliefs or beliefs rooted in personal experience. On the other hand, surprising individual cases have a powerful impact and are a more effective tool for teaching psychology because the incongruity must be resolved and embedded in a causal story.
34%
whenever the correlation between two scores is imperfect, there will be regression to the mean.
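A minimal simulation of that claim (my own sketch, not from the book): two test scores share a stable component plus independent luck, so their correlation is imperfect, and the top performers on the first test score closer to average on the second.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

talent = rng.normal(0, 1, n)             # stable component shared by both scores
score1 = talent + rng.normal(0, 1, n)    # first measurement = talent + luck
score2 = talent + rng.normal(0, 1, n)    # second measurement, fresh luck

r = np.corrcoef(score1, score2)[0, 1]    # imperfect correlation (about 0.5 here)
top = score1 > np.quantile(score1, 0.95) # best performers on the first test

print(f"correlation between scores: {r:.2f}")
print(f"top group, first score mean:  {score1[top].mean():.2f}")
print(f"top group, second score mean: {score2[top].mean():.2f}  (regressed toward the mean)")
```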
38%
Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.