Thinking, Fast and Slow
Kindle Notes & Highlights
Read from April 18, 2017 to January 19, 2020
In The Black Swan, Taleb introduced the notion of a narrative fallacy to describe how flawed stories of the past shape our views of the world and our expectations for the future.
Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen.
Taleb suggests that we humans constantly fool ourselves by constructing flimsy accounts of the past...
The halo effect discussed earlier contributes to coherence, because it inclines us to match our view of all the qualities of a person to our judgment of one attribute that is particularly significant.
The halo effect helps keep explanatory narratives simple and coherent by exaggerating the consistency of evaluations:
Inconsistencies reduce the ease of our thoughts and the clarity of our feelings.
A compelling narrative fosters an illusion of inevitability.
The ultimate test of an explanation is whether it would have made the event predictable in advance.
The human mind does not deal well with nonevents.
Of course there was a great deal of skill in the Google story, but luck played a more important role in the actual event than it does in the telling of it. And the more luck was involved, the less there is to be learned.
You build the best possible story from the information available to you, and if it is a good story, you believe it.
Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle.
Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited a...
We can know something only if it is both true and knowable.
What is perverse about the use of know in this context is not that some individuals get credit for prescience that they do not deserve. It is that the language implies that the world is more knowable than it is. It helps perpetuate a pernicious illusion.
The core of the illusion is that we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do.
To think clearly about the future, we need to clean up the language that we use in labeling the beliefs we had in the past.
Learning from surprises is a reasonable thing to do, but it can have some dangerous consequences.
A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or of any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.
Asked to reconstruct their former beliefs, people retrieve their current ones instead—an instance of substitution—and many cannot believe that they ever felt differently.
Fischhoff first demonstrated this “I-knew-it-all-along” effect, or hindsight bias.
The results were clear. If an event had actually occurred, people exaggerated the probability that they had assigned to it earlier. If the possible event had not come to pass, the participants erroneously recalled that they had always considered it unlikely.
The tendency to revise the history of one’s beliefs in light of what actually happened produces a robust cognitive illusion.
Hindsight bias has pernicious effects on the evaluations of decision makers. It leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad.
This outcome bias makes it almost impossible to evaluate a decision properly—in terms of the beliefs that were reasonable when the decision was made.
Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact.
Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.
The worse the consequence, the greater the hindsight bias.
Because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions—and to an extreme reluctance to take risks. As malpractice litigation became more common, physicians changed their procedures in multiple ways: ordered more tests, referred more cases to specialists, applied conventional treatments even when they were unlikely to help. These actions protected the physicians more than they benefited the patients, creating the potential for conflicts of interest.
Jon Bell
The comfort of following a standard practice vs. doing what is best.
Although hindsight and the outcome bias generally foster risk aversion, they also bring undeserved rewards to irresponsible risk seekers, such as a general or an entrepreneur who took a crazy gamble and won.
A few lucky gambles can crown a reckless leader with a halo of prescience and boldness.
The sense-making machinery of System 1 makes us see the world as more tidy, simple, predictable, and coherent than it really is. The illusion that one has understood the past feeds the further illusion that one can predict and control the future.
If you expected this value to be higher—and most of us do—then you should take that as an indication that you are prone to overestimate the predictability of the world you live in.
Because of the halo effect, we get the causal relationship backward: we are prone to believe that the firm fails because its CEO is rigid, when the truth is that the CEO appears to be rigid because the firm is failing. This is how illusions of understanding are born.
Because luck plays a large role, the quality of leadership and management practices cannot be inferred reliably from observations of success. And even if you had perfect foreknowledge that a CEO has brilliant vision and extraordinary competence, you still would be unable to predict how the company will perform with much better accuracy than the flip of a coin.
Stories of how businesses rise and fall strike a chord with readers by offering what the human mind needs: a simple message of triumph and failure that identifies clear causes and ignores the determinative power of luck and the inevitability of regression. These stories induce and maintain an illusion of understanding, imparting lessons of little enduring value to readers who are all too eager to believe them.
System 1 is designed to jump to conclusions from little evidence—and it is not designed to know the size of its jumps.
For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs. Considering how little we know, the confidence we have in our beliefs is preposterous—and it is also essential.
I was reminded of the Müller-Lyer illusion, in which we know the lines are of equal length yet still see them as being different. I was so struck by the analogy that I coined a term for our experience: the illusion of validity.
Looking back, the most striking part of the story is that our knowledge of the general rule—that we could not predict—had no effect on our confidence in individual cases.
Just as Nisbett and Borgida showed, people are often reluctant to infer the particular from the general.
Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.
A basic test of skill: persistent achievement. The diagnostic for the existence of any skill is the consistency of individual differences in achievement.
The evidence from more than fifty years of research is conclusive: for a large majority of fund managers, the selection of stocks is more like rolling dice than like playing poker. Typically at least two out of every three mutual funds underperform the overall market in any given year.
The illusion of skill is not only an individual aberration; it is deeply ingrained in the culture of the industry. Facts that challenge such basic assumptions—and thereby threaten people’s livelihood and self-esteem—are simply not absorbed. The mind does not digest them. This is particularly true of statistical studies of performance, which provide base-rate information that people generally ignore when it clashes with their personal impressions from experience.
Finally, the illusions of validity and skill are supported by a powerful professional culture. We know that people can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers.
Those who know more forecast very slightly better than those who know less. But those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident.
Experts are led astray not by what they believe, but by how they think.
The main point of this chapter is not that people who attempt to predict the future make many errors; that goes without saying. The first lesson is that errors of prediction are inevitable because the world is unpredictable.
The second is that high subjective confidence is not to be trusted as an indicator of accuracy (low confidence could be more informative).