Statisticians call that the base rate—how common something is within a broader class. Daniel Kahneman has a much more evocative visual term for it. He calls it the “outside view”—in contrast to the “inside view,” which is the specifics of the particular case.
It’s natural to be drawn to the inside view. It’s usually concrete and filled with engaging detail we can use to craft a story about what’s going on. The outside view is typically abstract, bare, and doesn’t lend itself so readily to storytelling. So even smart, accomplished people routinely fail to consider the outside view.
You may wonder why the outside view should come first. After all, you could dive into the inside view and draw conclusions, then turn to the outside view. Wouldn’t that work as well? Unfortunately, no, it probably wouldn’t. The reason is a basic psychological concept called anchoring.
The number we start with is called the anchor. It’s important because we typically underadjust, which means a bad anchor can easily produce a bad estimate.
But if she starts with the outside view, her analysis will begin with an anchor that is meaningful. And a better anchor is a distinct advantage.
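To see why the anchor matters so much, here is a toy numerical sketch (the adjustment fraction and all numbers are illustrative assumptions, not figures from the book): if adjustment closes only part of the gap between the anchor and the truth, a meaningful outside-view anchor leaves you far closer to the mark than an arbitrary one.

```python
# Toy model of anchoring with underadjustment. The fixed adjustment
# fraction and all numbers below are assumptions for illustration only.
def adjusted_estimate(anchor, truth, adjustment_fraction=0.6):
    """Final estimate when adjustment closes only part of the anchor-truth gap."""
    return anchor + adjustment_fraction * (truth - anchor)

truth = 0.70               # the probability careful analysis would support
arbitrary_anchor = 0.05    # a hunch pulled from vivid inside-view detail
base_rate_anchor = 0.55    # a hypothetical outside-view base rate

print(adjusted_estimate(arbitrary_anchor, truth))  # ~0.44, badly short of 0.70
print(adjusted_estimate(base_rate_anchor, truth))  # ~0.64, much closer
```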
A good exploration of the inside view does not involve wandering around, soaking up any and all information and hoping that insight somehow emerges. It is targeted and purposeful: it is an investigation, not an amble.
It is detective work as real investigators do it, not as detectives do it on TV shows. It's methodical, slow, and demanding. But it works far better than wandering aimlessly in a forest of information.
So you have an outside view and an inside view. Now they have to be merged, just as your brain merges the different perspectives of your two eyeballs into a single vision.
Coming up with an outside view, an inside view, and a synthesis of the two isn’t the end. It’s a good beginning. Superforecasters constantly look for other views they can synthesize into their own.
Researchers have found that merely asking people to assume their initial judgment is wrong, to seriously consider why that might be, and then make another judgment, produces a second estimate which, when combined with the first, improves accuracy almost as much as getting a second estimate from another person.
This approach, built on the “wisdom of the crowd” concept, has been called “the crowd within.”
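A minimal simulation of the idea (the noise model is an assumption for illustration, not the researchers' data): averaging two noisy readings of the same quantity lets their errors partially cancel.

```python
import random

# Illustrative simulation of "the crowd within" under an assumed noise model.
random.seed(0)
truth, trials = 0.6, 100_000
single_sq_err = avg_sq_err = 0.0
for _ in range(trials):
    first = truth + random.gauss(0, 0.15)   # initial judgment
    second = truth + random.gauss(0, 0.15)  # second judgment, made after
                                            # assuming the first was wrong
    single_sq_err += (first - truth) ** 2
    avg_sq_err += ((first + second) / 2 - truth) ** 2
print(single_sq_err / trials)  # ~0.0225
print(avg_sq_err / trials)     # ~0.0113, roughly half the squared error
```

With fully independent errors, averaging halves the expected squared error. In practice a second guess from the same head is correlated with the first, which is why the crowd within captures most, but not quite all, of the benefit of a genuinely independent second opinion.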
There is an even simpler way of getting another perspective on a question: tweak its wording.
Superforecasters pursue point-counterpoint discussions routinely, and they keep at them long past the point where most people would succumb to migraines.
In personality psychology, one of the “Big Five” traits is “openness to experience,” which has various dimensions, including preference for variety and intellectual curiosity. It’s unmistakable in many superforecasters.
For superforecasters, beliefs are hypotheses to be tested, not treasures to be guarded.
I have yet to find a superforecaster who isn't comfortable with numbers, and most are more than capable of putting them to practical use.
On Wall Street, math wizards are called quants, and the math they use can get a lot more esoteric than Monte Carlo models.
While superforecasters do occasionally deploy their own explicit math models, or consult other people’s, that’s rare. The great majority of their forecasts are simply the product of careful thought and nuanced judgment.
Superior numeracy does help superforecasters, but not because it lets them tap into arcane math models that divine the future. The truth is simpler, subtler, and much more interesting.
A smart executive will not expect universal agreement, and will treat its appearance as a warning flag that groupthink has taken hold. An array of judgments is welcome proof that the people around the table are actually thinking for themselves and offering their unique perspectives.
Obama told him in a later interview, “In this situation, what you started to get was probabilities that disguised uncertainty as opposed to actually providing you with useful information.”
The ignorance prior is the state of knowledge you are in before you know whether the coin will land heads or tails.
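Stated as a formula (a standard statement of the uniform prior, not a quotation from the book): for a question with $n$ mutually exclusive outcomes and no other information, the ignorance prior assigns each outcome equal probability.

\[ P(\text{outcome}_i) = \frac{1}{n}, \qquad \text{so for the coin } P(\text{heads}) = P(\text{tails}) = \frac{1}{2}. \]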
Both 0% and 100% weigh far more heavily in our minds than the mathematical models of economists say they should.
Our ancestors couldn't maintain a state of constant alert. The cognitive cost would have been too great. They needed worry-free zones. The solution? Ignore small chances and use the two-setting dial as much as possible. Either it is a lion or it isn't.
A confident yes or no is satisfying in a way that maybe never is, a fact that helps to explain why the media so often turn to hedgehogs who are sure they know what is coming no matter how bad their forecasting records may be.
People equate confidence and competence, which makes the forecaster who says something has a middling probability of happening seem less worthy of respect.
But outside of a classroom, away from abstractions, when dealing with real issues, these educated, accomplished people reverted to the intuitive.
One of twentieth-century science’s great accomplishments has been to show that uncertainty is an ineradicable element of reality.
A word like “unlikely” introduces dangerous ambiguity, as we have seen, which is why scientists prefer numbers. And those numbers should be as finely subdivided as forecasters can manage.
The finer grained the better, as long as the granularity captures real distinctions.
Epistemic uncertainty is something you don’t know but is, at least in theory, knowable. If you wanted to predict the workings of a mystery machine, skilled engineers could, in theory, pry it open and figure it out.
Aleatory uncertainty is something you not only don’t know; it is unknowable.
Forecasters who use a three-setting mental dial are much likelier to use 50% when they are asked to make probabilistic judgments because they use it as a stand-in for maybe. Hence, we should expect frequent users of 50% to be less accurate.
Superforecasters were much more granular. Fully one-third of their forecasts used the single-percentage-point scale.
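A toy simulation of why granularity should matter (an assumed setup, not the tournament data; accuracy here is mean squared error between forecast and outcome, the core of the standard Brier score used for probability forecasts): forcing a well-calibrated forecaster onto a three-setting dial measurably worsens the score.

```python
import random

# Assumed toy setup: a forecaster with calibrated beliefs reports them either
# at percentage-point granularity or rounded to a three-setting dial
# (0%, 50%, 100%). Lower mean squared error means better accuracy.
random.seed(1)
trials = 100_000
fine_err = coarse_err = 0.0
for _ in range(trials):
    p = random.random()                       # the forecaster's calibrated belief
    outcome = 1.0 if random.random() < p else 0.0
    fine = round(p, 2)                        # single-percentage-point scale
    coarse = min((0.0, 0.5, 1.0), key=lambda d: abs(d - p))  # yes / maybe / no
    fine_err += (fine - outcome) ** 2
    coarse_err += (coarse - outcome) ** 2
print(fine_err / trials)    # ~0.167
print(coarse_err / trials)  # ~0.188: the coarse dial scores visibly worse
```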
Only the naive ask “Why?” Those who see reality more clearly don't bother. It's a trenchant insight. When something unlikely and important happens, it's deeply human to ask “Why?”
Meaning is a basic human need. As much research shows, the ability to find it is a marker of a healthy, resilient mind.
Those who had contemplated alternative paths in life saw the path taken as meant to be.
Most people don’t think “Wow, what luck!” Instead, they take the sheer improbability of it happening, and the fact it happened, as proof it was meant to happen.
In Kahneman’s terms, probabilistic thinkers take the outside view toward even profoundly identity-defining events, seeing them as quasi-random draws from distributions of once-possible worlds. Or, in Kurt Vonnegut’s terms, “Why me? Why not me?”
We should expect superforecasters to be much less inclined to see things as fated.
The more a forecaster embraced probabilistic thinking, the more accurate she was.
So finding meaning in events is positively correlated with well-being but negatively correlated with foresight. That sets up a depressing possibility: Is misery the price of accuracy? I don't know.
A forecast that is updated to reflect the latest available information is likely to be closer to the truth than a forecast that isn’t so informed.
“When the facts change, I change my mind,” the legendary British economist John Maynard Keynes declared. “What do you do, sir?”
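One standard way to formalize changing your mind when the facts change is Bayes' rule in odds form (a textbook identity, offered here as a sketch rather than as any particular forecaster's procedure):

\[ \frac{P(H \mid E)}{P(\lnot H \mid E)} = \frac{P(E \mid H)}{P(E \mid \lnot H)} \times \frac{P(H)}{P(\lnot H)} \]

Each new piece of evidence $E$ multiplies the current odds on hypothesis $H$ by the likelihood ratio: how much more probable the evidence is if $H$ is true than if it is false. Frequent small multiplications of this kind are one way to picture incremental updating.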
Superforecasters do monitor the news carefully and factor it into their forecasts, which is bound to give them a big advantage over the less attentive.
But that’s not the whole story. For one thing, superforecasters’ initial forecasts were at least 50% more accurate than those of regular forecasters.
No one can produce superior forecasts only by staying on top of what everyone knows. What makes the difference is correctly identifying and responding to subtler information so you zero in on the eventual outcome faster than others.
People can be astonishingly intransigent—and capable of rationalizing like crazy to avoid acknowledging new information that upsets their settled beliefs.
Social psychologists have long known that getting people to publicly commit to a belief is a great way to freeze it in place, making it resistant to change. The stronger the commitment, the greater the resistance.
Our beliefs about ourselves and the world are built on each other in a Jenga-like fashion.

