Kindle Notes & Highlights
Read between May 11 and August 26, 2020
Harry Truman once joked that he wanted to hear from a one-armed economist because he was sick of hearing “on the one hand…on the other…”—a joke that bears more than a passing resemblance to Tversky’s.
Scientific facts that look as solid as rock to one generation of scientists can be crushed to dust beneath the advances of the next. All scientific knowledge is tentative. Nothing is chiseled in granite. In practice, of course, scientists do use the language
Epistemic uncertainty is something you don’t know but is, at least in theory, knowable.
Aleatory uncertainty is something you not only don’t know; it is unknowable.
Aleatory uncertainty ensures life will always have surprises, regardless of how carefully we plan.
Superforecasters were much more granular. Fully one-third of their forecasts used the single percentage point scale, meaning they would think carefully and decide that the chance of something happening was, say, 3% rather than 4%. Like the Treasury aide taught to think in fine-grained probabilities by his boss, Robert
“There is a 73% probability Apple’s stock will finish the year 24% above where it started.” Toss in a few technical terms most people don’t understand—“stochastic” this, “regression” that—and you can use people’s justified respect for math and science to get them nodding along. This is granularity as bafflegab. It is unfortunately common.
Science doesn’t tackle “why” questions about the purpose of life. It sticks to “how” questions that focus on causation and probabilities.
counterfactual thinking, which is thinking about how something might have turned out differently than it actually did.
students who had engaged in counterfactual thinking—imagining the different choices they might have made—imbued their decision to come to Northwestern with more meaning.
Again, imagining how things might have turned out differently caused people to imbue the relationship with deeper significance.
Think about the love of your life and the countless events that had to happen as they did to bring the two of you together. If you had studied that night rather than gone to the party. Or your spouse had walked a bit faster and not missed that train. Or you had accepted your friend’s invitation to go out of town that weekend.
Most people don’t think “Wow, what luck!” Instead, they take the sheer improbability of it happening, and the fact it happened, as proof it was meant to happen.
Big Bang theory tells us how finely tuned the laws of nature need to be for stars, planets, and life to arise. Even tiny deviations and we would not exist. Most people don’t respond to that observation by saying “Wow, we were lucky!”—or by wondering whether billions of Big Bangs generated billions of parallel universes, a few of which turned out by chance to be life friendly. Some physicists think this way. But most of us suspect that something—perhaps God—was behind it. It was meant to be.
The probabilistic thinker would say, “Yes, it was extremely improbable that I would meet my partner that night, but I had to be somewhere and she had to be somewhere and happily for us our somewheres coincided.”
The economist and Nobel laureate Robert Shiller
The regular forecasters were a little lower still. And the superforecasters got the lowest score of all, firmly on the rejection-of-fate side.
So finding meaning in events is positively correlated with wellbeing but negatively correlated with foresight. That sets up a depressing possibility: Is misery the price of accuracy?
Superforecasters update much more frequently, on average, than regular forecasters.
But a development like this, and the response it calls for, are clear to everyone, and no one can produce superior forecasts only by staying on top of what everyone knows. What makes the difference is correctly identifying and responding to subtler information so you zero in on the eventual outcome faster than others.
You have to home in on subtle information to make forecasts better than those of the crowds; you can’t just use information that everyone else knows
the Yasukuni Shrine, Bill strongly believed the answer was no. Yasukuni was founded in 1869 to honor Japan’s war dead and now lists almost 2.5 million soldiers. Conservatives like Abe revere it. But included among the names of the honored dead are those of about one thousand war criminals, including fourteen “class A” criminals. Visits to Yasukuni by Japanese leaders outrage the Chinese and Korean governments, and the government of the United States, Japan’s chief ally, constantly urges Japanese prime ministers not to damage relations this way.
So there are two dangers a forecaster faces after making the initial call. One is not giving enough weight to new information. That’s underreaction. The other danger is overreacting to new information, seeing it as more meaningful than it is, and adjusting a forecast too radically.
Reflecting on his mistake, Bill told me, “I think that the question I was really answering wasn’t ‘Will Abe visit Yasukuni?’ but ‘If I were PM of Japan, would I visit Yasukuni?’
Having strayed from the real question, Bill dismissed the new information because it was irrelevant to his replacement question.
On December 7, 1941, when the Japanese Imperial Navy attacked the United States at Pearl Harbor,
Earl Warren. At the time, Warren was attorney general of California. Later, he became governor, then chief justice of the US Supreme Court—and is remembered today as the liberal champion of school desegregation and civil rights.
But civil rights were not at the tip of Warren’s nose in World War II. Security was. His solution to the perceived threat was to round up and imprison every man, woman, and child of Japanese descent, a plan carried out between mid-February and August 1942, when 112,000 people—two-thirds of whom had been born in the United States—were shipped to isolated camps ringed with barbed wire and armed guards.
Holy shit that’s horrible - innocent Japanese Americans were rounded up following Pearl Harbor - over 100,000 of them
Social psychologists have long known that getting people to publicly commit to a belief is a great way to freeze it in place, making it resistant to change. The stronger the commitment, the greater the resistance.
The Yale professor Dan Kahan has done much research showing that our judgments about risks—Does gun control make us safer or put us in danger?—are driven less by a careful weighing of evidence than by our identities, which is why people’s views on gun control often correlate with their views on climate change, even though the two issues have no logical connection to each other.
Beliefs are often about our identities more than a conclusion we reached after following the evidence
As expected, those who got the irrelevant information lost confidence. Why? With nothing to go on but evidence that fits their stereotype of a good student or a child abuser, the signal feels strong and clear—and our judgment reflects that. But add irrelevant information and we can’t help but see Robert or David more as a person than a stereotype, which weakens the fit.
Many studies have found that those who trade more frequently get worse returns than those who lean toward old-fashioned buy-and-hold strategies. Malkiel cited one study of sixty-six thousand American households over a five-year period
Massive time and effort went into those trades and yet the people who made them would have been better off if they had gone golfing.
Greek mythology, any discussion of two opposing dangers called for Scylla and Charybdis. Scylla was a rock shoal off the coast of Italy. Charybdis was a whirlpool on the coast of Sicily, not far away. Sailors knew they would be doomed if they strayed too far in either direction. Forecasters should feel the same about under- and overreaction to new information, the Scylla and Charybdis of forecasting. Good updating is all about finding the middle passage.
And notice how small Tim’s changes are. There are no dramatic swings of thirty or forty percentage points. The average update was tiny, only 3.5%. That was critical.
A few small updates would have put Tim on a heading for underreaction. Many large updates could have tipped him toward overreaction. But with many small updates, Tim slipped safely between Scylla and Charybdis.
A forecaster who doesn’t adjust her views in light of new information won’t capture the value of that information, while a forecaster who is so impressed by the new information that he bases his forecast entirely on it will lose the value of the old information that underpinned his prior forecast. But the forecaster who carefully balances old and new captures the value in both—and puts it into her new forecast. The best way to do that is by updating often but bit by bit.
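One way to picture that balance (a minimal sketch, not a procedure from the book) is a simple weighted blend in Python: keep most of the old forecast and take a small step toward whatever probability the new information alone would suggest. The function name, the 0.1 weight, and the example signals are all assumed for illustration.

# Minimal sketch of "update often, but bit by bit" (assumed illustration only).
# old_forecast and new_signal are probabilities between 0 and 1;
# weight is the modest credence given to the new information on each pass.
def update_forecast(old_forecast, new_signal, weight=0.1):
    # Keep most of the prior forecast and nudge it toward the new signal.
    return (1 - weight) * old_forecast + weight * new_signal

forecast = 0.70  # hypothetical initial call
for signal in [0.60, 0.65, 0.55, 0.50]:  # hypothetical readings of incoming evidence
    forecast = update_forecast(forecast, signal)
    print(round(forecast, 3))  # a series of small moves, no dramatic swings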
Thomas Bayes. A Presbyterian minister, educated in logic, Bayes was born in 1701, so he lived at the dawn of modern probability theory,
P(H|D) / P(-H|D) = [P(D|H) / P(D|-H)] × [P(H) / P(-H)]
Posterior Odds = Likelihood Ratio × Prior Odds
The Bayesian belief-updating equation. In simple terms, the theorem says that your new belief should depend on two things—your prior belief (and all the knowledge that informed it) multiplied by the “diagnostic value” of the new information.
“Bayes’ theorem requires us to estimate two things: 1) how likely are we to see a poor Senate performance when the nominee is destined to fail and 2) how likely are we to see a poor performance when the nominee is bound for approval?”
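A hedged worked example of that odds-form calculation in Python, where H is "the nominee is destined to fail" and D is "a poor Senate performance." The specific figures are made up purely for illustration; the book does not supply them.

# Odds form of Bayes' theorem, applied to the nominee example with assumed numbers.
def bayes_odds_update(prior_prob, p_poor_if_fail, p_poor_if_approve):
    # P(H) / P(-H): prior odds that the nominee fails
    prior_odds = prior_prob / (1 - prior_prob)
    # P(D|H) / P(D|-H): how diagnostic a poor performance is
    likelihood_ratio = p_poor_if_fail / p_poor_if_approve
    # Posterior Odds = Likelihood Ratio x Prior Odds
    posterior_odds = likelihood_ratio * prior_odds
    # Convert odds back to a probability of failure
    return posterior_odds / (1 + posterior_odds)

# Assumed figures: a 20% prior chance of failure, poor performances seen 60% of
# the time when nominees are destined to fail and 20% when they are approved.
print(bayes_odds_update(0.20, 0.60, 0.20))  # about 0.43, up from the 0.20 prior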
So in Bayes’ theorem we’re really weighing how diagnostic the new evidence is—how much more likely we would be to see it if the hypothesis were true than if it were false
What matters far more to the superforecasters than Bayes’ theorem is Bayes’ core insight of gradually getting closer to the truth by constantly updating in proportion to the weight of the evidence.
So Bayes’ theorem is a way of gradually getting closer to the truth by constantly updating your forecasts in proportion to the weight of the evidence