Kindle Notes & Highlights
Systematic errors are known as biases, and they recur predictably in particular circumstances.
when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.
a puzzling limitation of our mind: our excessive confidence in what we believe we know, and our apparent inability to acknowledge the full extent of our ignorance and the uncertainty of the world we live in.
Overconfidence is fed by the illusory certainty of hindsight.
we can be blind to the obvious, and we are also blind to our blindness.
Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed.
familiarity is not easily distinguished from truth.
you must first know what the idea would mean if it were true. Only then can you decide whether or not to unbelieve it.
The operations of associative memory contribute to a general confirmation bias.
The tendency to like (or dislike) everything about a person—including things you have not observed—is known as the halo effect.
the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted.
It is the consistency of the information that matters for a good story, not its completeness.
Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.
The technical definition of heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions. The word comes from the same root as eureka.
a judgment that is based on substitution will inevitably be biased in predictable ways.
intuitions about random sampling appear to satisfy the law of small numbers, which asserts that the law of large numbers applies to small numbers as well.
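A short simulation (my own illustration, not from the book) makes the error behind the "law of small numbers" concrete: extreme proportions are routine in small samples and rare in large ones, so small samples do not mirror the population the way intuition expects.

```python
# Illustrative sketch: how often does a sample of fair coin flips come out
# at least 80% heads? Small samples produce such "extreme" results far more
# often than large ones -- the law of large numbers does NOT scale down.
import random

random.seed(0)

def extreme_rate(sample_size, trials=100_000, threshold=0.8):
    """Fraction of samples in which at least `threshold` of flips are heads."""
    hits = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size >= threshold:
            hits += 1
    return hits / trials

small = extreme_rate(5)    # n=5: extreme outcomes are common (~19%)
large = extreme_rate(50)   # n=50: extreme outcomes are vanishingly rare
print(small, large)
```

The sample sizes and 80% threshold are arbitrary choices for illustration; any threshold away from 50% shows the same gap.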
you are likely to stop when you are no longer sure you should go farther—at the near edge of the region of uncertainty.
System 1 understands sentences by trying to make them true, and the selective activation of compatible thoughts produces a family of systematic errors that make us gullible and prone to believe too strongly whatever we believe.
You can discover how the heuristic leads to biases by following a simple procedure: list factors other than frequency that make it easy to come up with instances.
The world in our heads is not a precise replica of reality.
The affect heuristic is an instance of substitution, in which the answer to an easy question (How do I feel about it?) serves as an answer to a much harder question (What do I think about it?).
The emotional tail wags the rational dog.
“all heuristics are equal, but availability is more equal than the others.”
Anchor your judgment of the probability of an outcome on a plausible base rate. Question the diagnosticity of your evidence.
the notions of coherence, plausibility, and probability are easily confused by the unwary.
adding detail to scenarios makes them more persuasive, but less likely to come true.
neglecting valid stereotypes inevitably results in suboptimal judgments.
Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is costless is wrong. The costs are worth paying to achieve a better society, but denying that the costs exist, while satisfying to the soul and politically correct, is not scientifically defensible.
Subjects’ unwillingness to deduce the particular from the general was matched only by their willingness to infer the general from the particular.
But even compelling causal statistics will not change long-held beliefs or beliefs rooted in personal experience.
Intensity matching yields predictions that are as extreme as the evidence on which they are based.
Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.
Once you adopt a new view of the world (or of any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.
Your inability to reconstruct past beliefs will inevitably cause you to underestimate the extent to which you were surprised by past events.
The tendency to revise the history of one’s beliefs in light of what actually happened produces a robust cognitive illusion.
Hindsight bias has pernicious effects on the evaluations of decision makers. It leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad.
In the presence of randomness, regular patterns can only be mirages.
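The point that apparent regularities arise from pure noise can be checked directly. The sketch below (my own, not from the book) measures the longest streak of identical outcomes in 100 fair coin flips: the typical longest run is around seven, long enough that the eye readily mistakes it for a pattern.

```python
# Illustrative sketch: long "streaks" are the norm in random sequences,
# not evidence of an underlying pattern.
import random

random.seed(1)

def longest_run(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

runs = [longest_run([random.randint(0, 1) for _ in range(100)])
        for _ in range(10_000)]
avg = sum(runs) / len(runs)
print(avg)  # typically near 7 for sequences of 100 fair flips
```

The sequence length (100) and trial count are arbitrary; longer sequences produce proportionally longer typical streaks, which is exactly why hot-hand impressions survive contact with random data.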
The idea that the future is unpredictable is undermined every day by the ease with which the past is explained.
“We reach the point of diminishing marginal predictive returns for knowledge disconcertingly quickly.”
“She has a coherent story that explains all she knows, and the coherence makes her feel good.”
“She is a hedgehog. She has a theory that explains everything, and it gives her the illusion that she understands the world.”
“Intuition is nothing more and nothing less than recognition.”
true experts know the limits of their knowledge.
intuition cannot be trusted in the absence of stable regularities in the environment.
She is very confident in her decision, but subjective confidence is a poor index of the accuracy of a judgment.
In terms of its consequences for decisions, the optimistic bias may well be the most significant of the cognitive biases.
Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients. An unbiased appreciation of uncertainty is a cornerstone of rationality—but it is not what people and organizations want.