Risk Savvy: How to Make Good Decisions
Read between December 11, 2019 - March 28, 2020
1%
Risk literacy is the basic knowledge required to deal with a modern technological society. The breakneck speed of technological innovation will make risk literacy as indispensable in the twenty-first century as reading and writing were in previous centuries.
2%
New forecasting technology has enabled meteorologists to replace mere verbal statements of certainty (“it will rain tomorrow”) or chance (“it is likely”) with numerical precision. But greater precision has not led to greater understanding of what the message really is.
2%
Always ask for the reference class: Percent of what?
3%
Always ask: What is the absolute risk increase?
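For a concrete sense of why this question matters, here is a minimal sketch with invented numbers: a headline claiming a drug "doubles the risk" of a side effect can correspond to an absolute increase of a single case per several thousand people. The figures below are hypothetical, chosen only for illustration.

```python
# Relative vs. absolute risk increase, with hypothetical numbers:
# suppose a drug raises the risk of a side effect from 1 in 7,000 to 2 in 7,000.
baseline_risk = 1 / 7_000   # risk without the drug (assumed for illustration)
treated_risk = 2 / 7_000    # risk with the drug (assumed for illustration)

relative_increase = (treated_risk - baseline_risk) / baseline_risk  # 1.0
absolute_increase = treated_risk - baseline_risk                    # ~0.00014

print(f"Relative risk increase: {relative_increase:.0%}")           # "100%"
print(f"Absolute risk increase: {absolute_increase:.5f} "
      f"(i.e., 1 extra case per 7,000)")
```

The same data yield a frightening "100 percent higher risk" and a sober "one extra case in 7,000"; only the absolute figure tells you how much the risk actually changed.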
3%
It should be the ethical responsibility of every editor to enforce transparent reporting, and it should be on the agenda of every ethics committee and every department of health. But it is not.
4%
If reason conflicts with a strong emotion, don’t try to argue. Enlist a conflicting and stronger emotion.
5%
My story is different. People aren’t stupid. The problem is that our educational system has an amazing blind spot concerning risk literacy.
5%
And we teach our children biology but not the psychology that shapes their fears and desires.
5%
The most astounding part is our collective amnesia: Most of us are still anxious to see stock market predictions even if they have been consistently wrong year after year.
6%
…believe that an expert horoscope is absolutely certain.1 Yet there is no evidence that horoscopes do better than a good friend asked to predict your future. But when technology is involved, the illusion of certainty is amplified.
7%
RISK: If risks are known, good decisions require logic and statistical thinking.
UNCERTAINTY: If some risks are unknown, good decisions also require intuition and smart rules of thumb.
7%
Probability is not one of a kind; it was born with three faces: frequency, physical design, and degrees of belief.
7%
Probabilities by design are called propensities.
8%
As noted in chapter 1, “a 30 percent chance of rain tomorrow” is a single-event probability, while “it will rain on 30 percent of the days for which this announcement is made” is a frequency statement that makes the reference class clear (days, not region or time).
8%
Every rule of thumb I am aware of can be used consciously and unconsciously. If it is used unconsciously, the resulting judgment is called intuitive.
9%
To assume that intelligence is necessarily conscious and deliberate is a big error. Most parts of our brain are unconscious, and we would be doomed without the vast experience stored there.
9%
Calculated intelligence may do the job for known risks, but in the face of uncertainty, intuition is indispensable.
9%
It is called a heuristic because it focuses on the one or few pieces of information that are important and ignores the rest. Experts often search for less information than novices do, using heuristics instead.
10%
Figure 2-5. What does a positive HIV test mean? The answer depends on the prevalence and the false-positive rate. Left tree: The false-positive rate is 5 in 100,000 according to Amy’s doctor. The prevalence is 1 in 10,000 for women with no known risk factors. Among 100,000 women who take the test, 10 will likely be infected, and also correctly test positive. Among the 99,990 not infected, the test is likely to err in five cases—that is, result in a false positive. Thus, we expect that 15 persons will test positive but only 10 will actually be infected. Right tree: Here, the false-positive rate …
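The left tree is just arithmetic on natural frequencies, and it can be reproduced directly. A minimal sketch using the figures quoted in the caption; the assumption that the test catches every true infection follows from the caption's "10 will likely be infected, and also correctly test positive":

```python
# Natural-frequency tree for the left tree in Figure 2-5.
# Figures from the caption: prevalence 1 in 10,000; 5 false positives among
# the women tested; sensitivity assumed to be 1 (every infection is caught).

def positive_predictive_value(population, prevalence, false_positives, sensitivity=1.0):
    """Return (true positives, false positives, P(infected | positive test))."""
    infected = population * prevalence                   # 100,000 * 1/10,000 = 10 women
    true_positives = infected * sensitivity              # all 10 also test positive
    total_positives = true_positives + false_positives   # 10 + 5 = 15
    return true_positives, false_positives, true_positives / total_positives

tp, fp, ppv = positive_predictive_value(100_000, 1 / 10_000, false_positives=5)
print(f"{tp:.0f} infected and positive, {fp} false positives "
      f"-> chance of infection given a positive test: {ppv:.0%}")   # 67%
```

Of the 15 women who test positive, only 10 are infected, so a positive test means roughly a two-in-three chance of infection, not certainty.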
11%
There is a similarity between the turkey’s unexpected disaster and experts’ inability to anticipate financial crises: Both use models that might work in the short run, but cannot foresee the disaster looming ahead.
11%
Because housing prices kept rising, the risk seemed to decline. Confidence in stability was highest before the onset of the subprime crisis.
11%
The problem is improper risk measurement: methods that wrongly assume known risks in a world of uncertainty. Because these calculations generate precise numbers for an uncertain risk, they produce an illusory certainty.
11%
Some advice is considered so obvious that it cannot but be true: More information is always better. More calculation is always better. As we will see, this is a big mistake. In an uncertain world, complex decision-making methods involving more information and calculation are often worse and can cause damage by invoking unwarranted certainty.
11%
When making decisions, so the argument continues, rules of thumb are always second best. Yet that is only true in a world of known risk, not in an uncertain world. To make good decisions in an uncertain world, one has to ignore part of the information, which is exactly what rules of thumb do. Doing so can save time and effort and lead to better decisions.
12%
If you work in the middle management of a company, your life probably revolves around the fear of doing something wrong and being blamed for it. Such a climate is not a good one for innovation, because originality requires taking risks and making errors along the way. No risks, no errors, no innovation.
13%
Making such “errors” is not a flaw; without them we wouldn’t recognize the objects around us. If a system does not make errors, it is not intelligent. Visual illusions in fact demonstrate the success rather than the failure of cognition.
14%
To quote the head of risk management of an international airline: “If we had the safety culture of a hospital, we would crash two planes a day.”
14%
To show that the effect of the checklist was not restricted to his hospital, Pronovost got more than a hundred ICUs in Michigan to cooperate in a large study. Importantly, each ICU was encouraged to develop its own checklist to fit its unique barriers and culture. The participating ICUs had reported a total of 695 catheter-related bloodstream infections annually before the study. Only three months after the introduction of the checklists, most ICUs cut the infection rate to zero.
15%
Why are checklists used in every cockpit but not in every ICU? Both are largely commercial enterprises, so why is it so much safer in a plane than in a hospital? The answer can be found in their different error cultures. First, the hierarchical structure in hospitals is not fertile ground for checklists, which, as mentioned, might require a female nurse to remind a male surgeon to wash his hands. Second, the consequences affect both parties equally in aviation: If passengers die in a crash, pilots die as well, whereas if patients die, doctors’ lives are not endangered. Third, when a plane …
15%
Immediately he was attacked by his peer surgeons for making facts about patient safety public, and, rather than praising him for his openness, the press bashed the entire medical profession for their shoddy work. Zero tolerance for talking about errors produces more errors and less patient safety.
15%
Defensive Decision Making
Many a committee meeting ends with “We need more data.” Everybody nods, breathing a sigh of relief, happy that the decision has been deferred. A week or so later, when the data are in, the group is no further ahead. Everyone’s time is wasted on another meeting, on waiting for even more data. The culprit is a negative error culture, in which everyone lacks the courage to make a decision for which they may be punished. Not making a decision or procrastinating in order to avoid responsibility is the most blatant form of defensive decision making. If something goes wrong, …
15%
Defensive decision making: A person or group ranks option A as the best for the situation, but chooses an inferior option B to protect itself in case something goes wrong.
15%
Recognition heuristic: If you recognize the name of one company but not that of the other, then infer that the recognized company provides the better value. This simple rule is often a good guide.12 But it can lead to the dominance of a few firms that grow bigger and bigger and can no longer deliver the best quality.
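As a decision rule this is almost trivially simple. A minimal sketch, where the set of recognized names and the example brands are hypothetical:

```python
# Recognition heuristic: if exactly one of two options is recognized,
# infer that the recognized one has the higher value; otherwise guess
# (or fall back on other knowledge).
import random

def recognition_choice(a, b, recognized):
    if (a in recognized) != (b in recognized):   # exactly one is recognized
        return a if a in recognized else b
    return random.choice([a, b])                 # no recognition cue: guess

# Hypothetical example: a consumer who has only heard of one brand.
print(recognition_choice("WellKnownCo", "ObscureCo", recognized={"WellKnownCo"}))
```

The rule only discriminates when recognition itself carries information, which is also why, at scale, it can entrench the few firms everyone already recognizes.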
16%
Defensive medicine: A doctor orders tests or treatments that are not clinically indicated and might even harm the patient primarily because of fear of litigation.
17%
Don’t ask your doctors what they recommend to you; ask them what they would do if it were their mother, brother, or child.
18%
To avoid blame, people hide behind “safe” procedures. Rely on big names, do what everyone else does, and don’t listen to your intuition.
18%
Health insurers pay doctors and clinics a fortune for overtreatment, but only pennies for taking the time to explain to patients what the treatment alternatives are and their actual benefits and harms.
19%
Fear whatever your social group fears. This simple principle protects us when personal experience might be lethal. At the same time, it can also make us fear the wrong things.
20%
Thus, only 41 percent of Europeans understood that ordinary tomatoes also have genes, while the rest believed that nature made them without genes, or did not know.7 How the majority thinks vegetables reproduce remains a mystery. Part of the fear of genetically modified tomatoes seems to be grounded in basic ignorance about biology.
20%
Whereas liver and heart conditions suggest that the cause is within, the American vision is that the body itself is healthy and the enemy comes from without.
21%
In risk research people are sometimes divided into two kinds of personalities: risk seeking and risk averse. But it is misleading to generalize a person as one or the other. The very same person who is averse to the risk of genetically modified corn may be a chain smoker, and another who is terrified of burning wax candles on a Christmas tree may be willing to risk having a gun at home. Social learning is the reason why people aren’t generally risk seeking or risk averse. They tend to fear whatever their peers fear, resulting in a patchwork of risks taken and avoided.
21%
…biological preparedness is about learning to fear the dangers of old. Social imitation, in contrast, allows us to learn about new dangers.
22%
Annual polls of college freshmen showed that recent generations judged “being well off financially” as more important than “developing a meaningful philosophy of life,” while the opposite was true in the sixties and seventies.
22%
The Internal-External Locus of Control Scale is a questionnaire that measures how much people believe themselves to be in control of their own fate, as opposed to being controlled by others. The questionnaire was given to children aged nine to fourteen from 1960 to 2002. During that time children’s belief that they have control over their own destinies substantially declined. In 2002 the average child reported higher external control than 80 percent of their peers in 1960. When children experience little internal control over their lives, they tend to become anxious in the face of uncertainty: I’m …
23%
The banks’ inability to predict has nothing to do with being overly confident or cautious in general. That time round, all thirty banks were too pessimistic, only to be far too optimistic a year later.
24%
The larger picture is that bank analysts underestimate the volatility of the stock market and of the exchange rate. At fault, for one, are the mathematical models they use. These treat the highly unpredictable financial market as if its risks were predictable. As a consequence, the forecasts consistently miss large upswings or downswings and only do well if nothing remarkable happens—that is, when last year’s trend continues.
24%
But when Markowitz made his own investments for his retirement, he did not use his Nobel Prize–winning method. Instead, he employed a simple rule of thumb called 1/N: Allocate your money equally to each of N funds.
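The 1/N rule requires no parameter estimation at all. A minimal sketch, with the budget and fund names invented for illustration:

```python
# 1/N heuristic: split the investment equally across the N available funds.
# No historical returns, covariances, or optimization required.
def one_over_n(budget, funds):
    share = budget / len(funds)
    return {fund: share for fund in funds}

# Hypothetical portfolio of three funds.
print(one_over_n(9_000, ["Fund A", "Fund B", "Fund C"]))
# {'Fund A': 3000.0, 'Fund B': 3000.0, 'Fund C': 3000.0}
```

Nothing here depends on estimates from past data, which is precisely what makes the rule robust when, as below, those estimates would be unreliable.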
25%
Say you invested in fifty funds. How many years of stock data would be needed before the mean-variance method finally does better than 1/N? A computer simulation provides the answer: about five hundred years! That means that in the year 2500, investors can turn from the simple rule to the high-level math of the mean-variance model and hope to win. But this holds only if the same stocks—and the stock market—are still around.
25%
After the podium, the head of investment of a major international insurance company came up to me and said he would check his company’s investments. Three weeks later, he arrived at my office with his assistant. “I checked our investments beginning 1969. I compared 1/N to our actual investment strategies. We would have made more money if we had used this simple rule of thumb.” But then the real issue came up. “I have convinced myself that simple is better. But here’s my problem. How do I explain this to my customers? They might say, I can do that myself!”
25%
How far we go in simplifying depends on three features. First, the more uncertainty, the more we should make it simple. The less uncertainty, the more complex it should be.