Kindle Notes & Highlights
If a strong predictive cue exists, human observers will find it, given a decent opportunity to do so. Statistical algorithms greatly outdo humans in noisy environments for two reasons: they are more likely than human judges to detect weakly valid cues and much more likely to maintain a modest level of accuracy by using such cues consistently.
It is wrong to blame anyone for failing to forecast accurately in an unpredictable world. However, it seems fair to blame professionals for believing they can succeed in an impossible task.
intuition cannot be trusted in the absence of stable regularities in the environment.
Whether professionals have a chance to develop intuitive expertise depends essentially on the quality and speed of feedback, as well as on sufficient opportunity to practice.
System 1 is often able to produce quick answers to difficult questions by substitution, creating coherence where there is none.
the proper way to elicit information from a group is not by starting with a public discussion but by confidentially collecting each person’s judgment.
the “unknown unknowns.” There was no way for us to foresee, that day, the succession of events that would cause the project to drag out for so long.
There are many ways for any plan to fail, and although most of them are too improbable to be anticipated, the likelihood that something will go wrong in a big project is high.
The argument for the outside view should be made on general grounds: if the reference class is properly chosen, the outside view will give an indication of where the ballpark is, and it may suggest, as it did in our case, that the inside-view forecasts are not even close to it.
people who have information about an individual case rarely feel the need to know the statistics of the class to which the case belongs.
In the competition with the inside view, the outside view doesn’t stand a chance.
overly optimistic forecasts of the outcome of projects are found everywhere. Amos and I coined the term planning fallacy to describe plans and forecasts that are unrealistically close to best-case scenarios and could be improved by consulting the statistics of similar cases
The failures of forecasting in these cases reflect the customers’ inability to imagine how much their wishes will escalate over time.
The treatment for the planning fallacy has now acquired a technical name, reference class forecasting
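A hypothetical sketch of what reference class forecasting amounts to in practice: start from the distribution of outcomes in similar past projects, take a statistic of that distribution as the baseline, and only then adjust for case-specific information. The project data, the choice of the median as the baseline statistic, and the adjustment factor below are illustrative placeholders, not from the text.

```python
import statistics

# Hypothetical completion times (in months) for a reference class of
# comparable past projects; real forecasting would use actual records.
reference_class = [18, 24, 30, 36, 42, 48, 60]

# Outside view: the baseline forecast is a statistic of the class,
# not the inside-view plan.
baseline = statistics.median(reference_class)

# Inside view: adjust the baseline only modestly for case-specific
# information (here a made-up 10% downward adjustment).
adjustment = 0.9
forecast = baseline * adjustment

print(f"Outside-view baseline: {baseline} months, adjusted: {forecast:.1f} months")
```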
people often (but not always) take on risky projects because they are overly optimistic about the odds they face.
sunk-cost fallacy
pervasive optimistic bias. Most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be.
Optimistic individuals play a disproportionate role in shaping our lives. Their decisions make a difference; they are the inventors, the entrepreneurs, the political and military leaders—not average people.
More often than not, risk takers underestimate the odds they face, and do not invest sufficient effort to find out what the odds are. Because they misread the risks, optimistic entrepreneurs often believe they are prudent, even when they are not.
When action is needed, optimism, even of the mildly delusional variety, may be a good thing.
One of the benefits of an optimistic temperament is that it encourages persistence in the face of obstacles. But persistence can be costly.
The optimistic risk taking of entrepreneurs surely contributes to the economic dynamism of a capitalistic society, even if most risk takers end up disappointed.
illusion of control
The upshot is that people tend to be overly optimistic about their relative standing on any activity in which they do moderately well.
Overconfidence is another manifestation of WYSIATI: when we estimate a quantity, we rely on information that comes to mind and construct a coherent story in which the estimate makes sense.
inadequate appreciation of the uncertainty of the environment inevitably leads economic agents to take risks they should avoid.
people and firms reward the providers of dangerously misleading information more than they reward truth tellers. One of the lessons of the financial crisis that led to the Great Recession is that there are periods in which competition, among experts and among organizations, creates powerful forces that favor a collective blindness to risk and uncertainty.
An unbiased appreciation of uncertainty is a cornerstone of rationality—but it is not what people and organizations want. Extreme uncertainty is paralyzing under dangerous circumstances, and the admission that one is merely guessing is especially unacceptable when the stakes are high. Acting on pretended knowledge is often the preferred solution.
There is no evidence that risk takers in the economic domain have an unusual appetite for gambles on high stakes; they are merely less aware of risks than more timid people are.
The effects of high optimism on decision making are, at best, a mixed blessing, but the contribution of optimism to good implementation is certainly positive. The main benefit of optimism is resilience in the face of setbacks.
The main virtue of the premortem is that it legitimizes doubts.
Expected utility theory
an increase of stimulus intensity by a given factor (say, times 1.5 or times 10) always yields the same increment on the psychological scale.
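A short worked version of this claim, assuming the Fechner-style logarithmic scaling the passage is describing: if the psychological response to a stimulus of intensity S is

    psi(S) = k * log(S / S0)

then raising the stimulus from S to c*S changes the response by

    psi(c*S) - psi(S) = k * log(c)

a constant increment that depends only on the factor c (say, 1.5 or 10), not on the starting intensity S. The constants k and S0 are free parameters of the illustration.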
Many of the options we face in life are “mixed”: there is a risk of loss and an opportunity for gain, and we must decide whether to accept the gamble or reject it.
“loss aversion ratio”
In the mixed case, the possible loss looms twice as large as the possible gain, as you can see by comparing the slopes of the value function for losses and gains. In the bad case, the bending of the value curve (diminishing sensitivity) causes risk seeking. The pain of losing $900 is more than 90% of the pain of losing $1,000.
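A rough sketch of the value function behind these claims, using the functional form and parameter estimates commonly cited from Tversky and Kahneman's 1992 paper (exponents of about 0.88 and a loss aversion coefficient of about 2.25); the exact numbers are illustrative and not taken from this passage.

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains,
    convex and steeper for losses (loss aversion)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

# Loss aversion ratio: a $100 loss weighs roughly twice a $100 gain.
print(abs(value(-100)) / value(100))          # ~2.25

# Diminishing sensitivity: losing $900 hurts more than 90% as much as
# losing $1,000, so a sure $900 loss feels worse than a 90% chance of
# losing $1,000 (risk seeking in the bad case, ignoring probability
# weighting).
print(abs(value(-900)) / abs(value(-1000)))   # ~0.91
```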
prospect theory cannot deal with disappointment.
Prospect theory and utility theory also fail to allow for regret.
The omission of the reference point from the indifference map is a surprising case of theory-induced blindness, because we so often encounter cases in which the reference point obviously matters.
It is difficult to accept changes for the worse.
First, tastes are not fixed; they vary with the reference point. Second, the disadvantages of a change loom larger than its advantages, inducing a bias that favors the status quo.
loss aversion does not imply that you never prefer to change your situation; the benefits of an opportunity may exceed even overweighted losses. Loss aversion implies only that choices are strongly biased in favor of the reference situation (and generally biased to favor small rather than large changes).
The values were unequal because of loss aversion: giving up a bottle of nice wine is more painful than getting an equally good bottle is pleasurable.
The magic of the market did not work for a good that the owners expected to use.
Loss aversion is built into the automatic evaluations of System 1.
Selling goods that one would normally use activates regions of the brain that are associated with disgust and pain.
There are goods that the poor need and cannot afford, so they are always “in the losses.” Small amounts of money that they receive are therefore perceived as a reduced loss, not as a gain.
People who are poor think like traders, but the dynamics are quite different. Unlike traders, the poor are not indifferent to the differences between gaining and giving up. Their problem is that all their choices are between losses. Money that is spent on one good is the loss of another good that could have been purchased instead. For the poor, costs are losses.
The brains of humans and other animals contain a mechanism that is designed to give priority to bad news.
There is no real threat, but the mere reminder of a bad event is treated in System 1 as threatening.