Kindle Notes & Highlights
Read between May 11 - August 26, 2020
But this particular humiliation, on December 17, 2010, caused Mohamed Bouazizi, aged twenty-six, to set himself on fire, and Bouazizi’s self-immolation sparked protests. The police responded with typical brutality. The protests spread. Hoping to assuage the public, the dictator of Tunisia, President Zine el-Abidine Ben Ali, visited Bouazizi in the hospital. Bouazizi died on January 4, 2011. The unrest grew. On January 14, Ben Ali fled to a cushy exile in Saudi Arabia, ending his twenty-three-year kleptocracy.
In 1972 the American meteorologist Edward Lorenz wrote a paper with an arresting title: “Predictability: Does the Flap of a Butterfly’s Wings in Brazil Set Off a Tornado in Texas?”
It was an insight that would inspire “chaos theory”: in nonlinear systems like the atmosphere, even small changes in initial conditions can mushroom to enormous proportions.
In a world where a butterfly in Brazil can make the difference between just another sunny day in Texas and a tornado tearing through a town, it’s misguided to think anyone can see very far into the future.
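Lorenz's point can be made concrete with the logistic map, a standard toy model of chaos (an illustrative sketch of sensitive dependence on initial conditions, not Lorenz's actual weather model): two starting values that differ by one part in ten billion end up on completely different trajectories within a few dozen steps.

```python
def logistic(x, r=4.0):
    """One step of the logistic map; fully chaotic at r = 4."""
    return r * x * (1.0 - x)

# Two initial conditions differing by one part in ten billion.
x, y = 0.4, 0.4 + 1e-10
max_gap = abs(x - y)
for _ in range(60):
    x, y = logistic(x), logistic(y)
    max_gap = max(max_gap, abs(x - y))

# The microscopic initial difference mushrooms to macroscopic size.
print(max_gap)
```

The tiny initial gap roughly doubles with each step, so after a few dozen iterations the two trajectories bear no resemblance to each other, which is exactly why long-range point forecasts in nonlinear systems fail.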
“I have been struck by how important measurement is to improving the human condition,” Bill Gates wrote. “You can achieve incredible progress if you set a clear goal and find a measure that will drive progress toward that goal….This may seem basic, but it is amazing how often it is not done and how hard it is to get right.”8 He is right about what it takes to drive progress, and it is surprising how rarely it’s done in forecasting. Even that simple first step—setting a clear goal—hasn’t been taken.
In year 1, GJP beat the official control group by 60%. In year 2, we beat the control group by 78%. GJP also beat its university-affiliated competitors, including the University of Michigan and MIT, by hefty margins, from 30% to 70%, and even outperformed professional intelligence analysts with access to classified data. After two years, GJP was doing so much better than its academic competitors that IARPA dropped the other teams.10
Superforecasting does require minimum levels of intelligence, numeracy, and knowledge of the world, but anyone who reads serious books about psychological research probably has those prerequisites.
broadly speaking, superforecasting demands thinking that is open-minded, careful, curious, and—above all—self-critical. It also demands focus. The kind of thinking that produces superior judgment does not come effortlessly. Only the determined can deliver it reasonably consistently, which is why our analyses have consistently found commitment to self-improvement to be the strongest predictor of performance.
Today, it’s no longer impossible to imagine a forecasting competition in which a supercomputer trounces superforecasters and superpundits alike. After that happens, there will still be human forecasters, but like human Jeopardy! contestants, we will only watch them for entertainment.
“So what I want is that human expert paired with a computer to overcome the human cognitive limitations and biases.”14
What about bloodletting? Everyone from the ancient Greeks to George Washington’s doctors swore that it was wonderfully restorative, but did it work?
Consider Galen, the second-century physician to Roman emperors.
And Galen was untroubled by doubt. Each outcome confirmed he was right, no matter how equivocal the evidence might look to someone less wise than the master. “All who drink of this treatment recover in a short time, except those whom it does not help, who all die,” he wrote. “It is obvious, therefore, that it fails only in incurable cases.”
Wow - staggering that this was once commonly accepted. Before the scientific method, things were chaos.
It was cargo cult science, a term of mockery coined much later by the physicist Richard Feynman.
So cargo cult science has the outward form of science but lacks what makes it truly scientific.
What medicine lacked was doubt. “Doubt is not a fearful thing,” Feynman observed, “but a thing of very great value.”10 It’s what propels science forward.
the National Health Service—the British health care system—
The numbering of the two systems is not arbitrary. System 1 comes first. It is fast and constantly running in the background. If a question is asked and you instantly know the answer, it sprang from System 1. System 2 is charged with interrogating that answer. Does it stand up to scrutiny? Is it backed by evidence? This process takes time and effort, which is why the standard routine in decision making is this: first System 1 delivers an answer, and only then can System 2 get involved, starting with an examination of what System 1 decided.
As Daniel Kahneman puts it, “System 1 is designed to jump to conclusions from little evidence.”
You see the shadow. Snap! You are frightened—and running. That’s the “availability heuristic,” one of many System 1 operations—or heuristics—discovered by Daniel Kahneman, his collaborator Amos Tversky,
These tacit assumptions are so vital to System 1 that Kahneman gave them an ungainly but oddly memorable label: WYSIATI (What You See Is All There Is).
“split-brain” patients, meaning that the left and right hemispheres of their brains could not communicate with each other because the connection between them, the corpus callosum, had been surgically severed (traditionally as a treatment for severe epilepsy).
like a split-brain patient asked why he is pointing at a picture of a shovel when he has no idea why, the journalist conjures a plausible story from whatever is at hand.
pure confirmation bias: “If the patient is cured, it is evidence my treatment works; if the patient dies, it means nothing.”
I call it bait and switch: when faced with a hard question, we often surreptitiously replace it with an easy one. “Should I worry about the shadow in the long grass?” is a hard question. Without more data, it may be unanswerable. So we substitute an easier question: “Can I easily recall a lion attacking someone from the long grass?”
With training or experience, people can encode patterns deep in their memories in vast number and intricate detail—such as the estimated fifty thousand to one hundred thousand chess positions that top players have in their repertoire.
“…it is very likely that there are early indications that a building is about to collapse in a fire or that an infant will soon show obvious symptoms of infection,” Kahneman and Klein wrote. “On the other hand, it is unlikely that there is publicly available information that could be used to predict how well a particular stock will do—if such valid information existed, the price of the stock would already reflect it. Thus, we have more reason to trust the intuition of an experienced fireground commander about the stability of a building, or the intuitions of a nurse about an infant, than to…
Carlsen respects his intuition, as well he should, but he also does a lot of “double-checking” because he knows that sometimes intuition can let him down and conscious thought can improve his judgment.
so if you have the time to think before making a big decision, do so—and be prepared to accept that what seems obviously true now may turn out to be false later.
the president of Digital Equipment Corporation declaring in 1977 that “there is no reason anyone would want a computer in their home.”
The first step in learning what works in forecasting, and what doesn’t, is to judge forecasts, and to do that we can’t make assumptions about what the forecast means. We have to know.
Opposition to the arms race brought millions to the streets of cities across the Western world. In June 1982 an estimated seven hundred thousand people marched in New York City in one of the biggest demonstrations in American history.
Gorbachev changed direction swiftly and sharply. His policies of glasnost (openness) and perestroika (restructuring) liberalized the Soviet Union. Gorbachev also sought to normalize relations with the United States and reverse the arms race.
Obviously, a forecast without a time frame is absurd. And yet, forecasters routinely make them,
Coordinator of Information (COI) in 1941. The COI became the Office of Strategic Services (OSS). The OSS became the Central Intelligence Agency (CIA).
In the late 1940s, the Communist government of Yugoslavia broke from the Soviet Union, raising fears the Soviets would invade.
They had all agreed to use “serious possibility” in the NIE, so Kent asked each person, in turn, what he thought it meant. One analyst said it meant odds of about 80 to 20, or four times more likely than not that there would be an invasion. Another thought it meant odds of 20 to 80—exactly the opposite. Other answers were scattered between those extremes. Kent was floored.
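The arithmetic behind those answers is simple, and spelling it out shows why the disagreement is so alarming (a minimal sketch; the function name is mine, not Kent's):

```python
def odds_to_prob(in_favor, against):
    """Convert odds such as 80:20 into a probability."""
    return in_favor / (in_favor + against)

# Two analysts' readings of the same phrase, "serious possibility":
readings = [(80, 20), (20, 80)]
probs = [odds_to_prob(a, b) for a, b in readings]
print(probs)  # [0.8, 0.2]
```

The same two words meant an 80% chance of invasion to one analyst and a 20% chance to another: a sixty-percentage-point gap hidden inside a single phrase, which is precisely the ambiguity that numeric probabilities eliminate.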