Kindle Notes & Highlights
by Nate Silver
Read between January 31 and July 8, 2016
The instinctual shortcut that we take when we have “too much information” is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest.
We need to stop, and admit it: we have a prediction problem. We love to predict things—and we aren’t very good at it.
The signal is the truth. The noise is what distracts us from the truth. This is a book about the signal and the noise.
Human beings have an extraordinary capacity to ignore risks that threaten their livelihood, as though this will make them go away.
“The fox knows many little things, but the hedgehog knows one big thing.”
Foxes, on the other hand, are scrappy creatures who believe in a plethora of little ideas and in taking a multitude of approaches toward a problem. They tend to be more tolerant of nuance, uncertainty, complexity, and dissenting opinion. If hedgehogs are hunters, always looking out for the big kill, then foxes are gatherers.
But foxes happen to make much better predictions. They are quicker to recognize how noisy the data can be, and they are less inclined to chase false signals. They know more about what they don’t know.
“When the facts change, I change my mind,” the economist John Maynard Keynes famously said. “What do you do, sir?”
The way to become more objective is to recognize the influence that our assumptions play in our forecasts and to question ourselves about them. In politics, between our ideological predispositions and our propensity to weave tidy narratives from noisy data, this can be especially difficult.
He doesn’t give a shit. I would come to realize that without that attitude, Pedroia might have let the scouting reports go to his head and never have made the big leagues.
collect as much information as possible, but then be as rigorous and disciplined as possible when analyzing it.
Perfect predictions are impossible if the universe itself is random.
Some of the type of thinking I encourage in this book can probably be helpful in the realm of national security analysis. For instance, the Bayesian approach toward thinking about probability is more compatible with decision making under high uncertainty. It encourages us to hold a large number of hypotheses in our head at once, to think about them probabilistically, and to update them frequently when we come across new information that might be more or less consistent with them.
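The highlight above describes Bayesian updating in words: hold several hypotheses at once, assign each a probability, and revise those probabilities as new evidence arrives. A minimal sketch of that update rule follows; the hypotheses, prior probabilities, and likelihood values are hypothetical placeholders chosen for illustration, not figures from the book.

```python
# Minimal sketch of Bayesian updating across several competing hypotheses.
# The hypotheses, priors, and likelihoods below are illustrative assumptions.

def bayes_update(priors, likelihoods):
    """Return posterior probabilities given prior beliefs and the likelihood
    of the observed evidence under each hypothesis."""
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Three hypotheses held simultaneously, with prior degrees of belief.
priors = {"H1": 0.5, "H2": 0.3, "H3": 0.2}

# Probability of seeing the new evidence if each hypothesis were true.
likelihoods = {"H1": 0.1, "H2": 0.4, "H3": 0.7}

posteriors = bayes_update(priors, likelihoods)
print(posteriors)  # belief shifts toward hypotheses most consistent with the evidence
```

Running the sketch shows the prior favorite (H1) losing ground to H3, the hypothesis under which the new evidence is most probable, which is the frequent, incremental revision the passage recommends.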