The Signal and the Noise Quotes

The Signal and the Noise: Why So Many Predictions Fail—But Some Don't by Nate Silver
52,067 ratings, 3.97 average rating, 3,522 reviews
Showing 181-204 of 204
“The name overfitting comes from the way that statistical models are “fit” to match past observations. The fit can be too loose—this is called underfitting—in which case you will not be capturing as much of the signal as you could. Or it can be too tight—an overfit model—which means that you’re fitting the noise in the data rather than discovering its underlying structure.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“In statistics, the name given to the act of mistaking noise for a signal is overfitting.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“It was hard to tell the signal from the noise. The story the data tells us is often the one we’d like to hear, and we usually make sure that it has a happy ending.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“Indeed, the big brokerage firms tend to avoid standing out from the crowd, downgrading a stock only after its problems have become obvious. In October 2001, fifteen of the seventeen analysts following Enron still had a “buy” or “strong buy” recommendation on the stock even though it had already lost 50 percent of its value in the midst of the company’s accounting scandal.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“The most robust evidence indicates that this wisdom-of-crowds principle holds when forecasts are made independently before being averaged together. In a true betting market (including the stock market), people can and do react to one another’s behavior.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
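The independent-forecasts condition in the quote above can be illustrated with a toy simulation (my own, not Silver's): each forecaster makes an unbiased but noisy estimate on their own, and the averaged forecast lands much closer to the truth than a typical individual one. The true value, noise level, and crowd size are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

truth = 100.0          # the quantity everyone is trying to forecast
n_forecasters = 50

# Each forecaster makes an independent, unbiased but noisy estimate.
forecasts = truth + rng.normal(scale=10.0, size=n_forecasters)

individual_errors = np.abs(forecasts - truth)
crowd_error = abs(forecasts.mean() - truth)

print(f"average individual error:       {individual_errors.mean():.2f}")
print(f"error of the averaged forecast: {crowd_error:.2f}")
```

Because the errors are independent, averaging shrinks them roughly by a factor of the square root of the crowd size; when forecasters react to one another, as in a betting market, the errors correlate and this benefit erodes.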
“Meanwhile, exposure to so many new ideas was producing mass confusion. The amount of information was increasing much more rapidly than our understanding of what to do with it, or our ability to differentiate the useful information from the mistruths.13 Paradoxically, the result of having so much more shared knowledge was increasing isolation along national and religious lines. The instinctual shortcut that we take when we have “too much information” is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“‘Big Data’ was on its way to becoming a Big Idea. Google searches for the term doubled over the course of a year,1 as did mentions of it in the news media.2 Hundreds of books were published on the subject. If you picked up any business periodical in 2013, advertisements for Big Data were as ubiquitous as cigarettes in an episode of …”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“The fashionable term now is “Big Data.” IBM estimates that we are generating 2.5 quintillion bytes of data each day, more than 90 percent of which was created in the last two years.36 This exponential growth in information is sometimes seen as a cure-all, as computers were in the 1970s. Chris Anderson, the editor of Wired magazine, wrote in 2008 that the sheer volume of data would obviate the need for theory, and even the scientific method.37 This is an emphatically pro-science and pro-technology book, and I think of it as a very optimistic one. But it argues that these views are badly mistaken. The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning. Like Caesar, we may construe them in self-serving ways that are detached from their objective reality. Data-driven predictions can succeed—and they can fail. It is when we deny our role in the process that the odds of failure rise. Before we demand more of our data, we need to demand more of ourselves.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“The dysfunctional state of the American political system is the best reason to be pessimistic about our country’s future. Our scientific and technological prowess is the best reason to be optimistic. We are an inventive people. The United States produces ridiculous numbers of patents,114 has many of the world’s best universities and research institutions, and our companies lead the market in fields ranging from pharmaceuticals to information technology. If I had a choice between a tournament of ideas and a political cage match, I know which fight I’d rather be engaging in—especially if I thought I had the right forecast.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“As the statistician George E. P. Box wrote, “All models are wrong, but some models are useful.”90 What he meant by that is that all models are simplifications of the universe, as they must necessarily be. As another mathematician said, “The best model of a cat is a cat.”91”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“There are no guarantees that flu predictions will do better the next time around. In fact, the flu and other infectious diseases have several properties that make them intrinsically very challenging to predict.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“The Bayesian Invisible Hand
… free-market capitalism and Bayes’ theorem come out of something of the same intellectual tradition. Adam Smith and Thomas Bayes were contemporaries, and both were educated in Scotland and were heavily influenced by the philosopher David Hume. Smith’s 'Invisible hand' might be thought of as a Bayesian process, in which prices are gradually updated in response to changes in supply and demand, eventually reaching some equilibrium. Or, Bayesian reasoning might be thought of as an 'invisible hand' wherein we gradually update and improve our beliefs as we debate our ideas, sometimes placing bets on them when we can’t agree. Both are consensus-seeking processes that take advantage of the wisdom of crowds.
It might follow, then, that markets are an especially good way to make predictions. That’s really what the stock market is: a series of predictions about the future earnings and dividends of a company. My view is that this notion is 'mostly' right 'most' of the time. I advocate the use of betting markets for forecasting economic variables like GDP, for instance. One might expect these markets to improve predictions for the simple reason that they force us to put our money where our mouth is, and create an incentive for our forecasts to be accurate.
Another viewpoint, the efficient-market hypothesis, makes this point much more forcefully: it holds that it is 'impossible' under certain conditions to outpredict markets. This view, which was the orthodoxy in economics departments for several decades, has become unpopular given the recent bubbles and busts in the market, some of which seemed predictable after the fact. But, the theory is more robust than you might think.
And yet, a central premise of this book is that we must accept the fallibility of our judgment if we want to come to more accurate predictions. To the extent that markets are reflections of our collective judgment, they are fallible too. In fact, a market that makes perfect predictions is a logical impossibility.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
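The "gradually updated" beliefs in the quote above are just repeated applications of Bayes' theorem. A minimal sketch (the coin-flip scenario and numbers are my own illustration, not from the book): a belief about whether a coin is biased toward heads is revised after each flip.

```python
# Iterative Bayesian updating for a two-hypothesis problem.
# Hypothesis H: the coin lands heads 75% of the time.
# Alternative:  the coin is fair (heads 50% of the time).

def update(prior, likelihood_h, likelihood_alt):
    """One application of Bayes' theorem: return P(H | evidence)."""
    numerator = prior * likelihood_h
    return numerator / (numerator + (1 - prior) * likelihood_alt)

belief = 0.5                                # start undecided
for flip in ["H", "H", "T", "H", "H"]:
    p_h = 0.75 if flip == "H" else 0.25     # P(this flip | biased coin)
    p_alt = 0.5                             # P(this flip | fair coin)
    belief = update(belief, p_h, p_alt)
    print(f"after {flip}: P(biased) = {belief:.3f}")
```

Each new flip nudges the posterior rather than resetting it, much as the quote describes prices being nudged toward equilibrium by changes in supply and demand.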
“Sometimes the need to adapt the forecast to the consumer can take on comical dimensions. For many years, the Weather Channel had indicated rain on their radar maps with green shading (occasionally accompanied by yellow and red for severe storms). At some point in 2001, someone in the marketing department got the bright idea to make rain blue instead—which is, after all, what we think of as the color of water. The Weather Channel was quickly besieged with phone calls from outraged—and occasionally terrified—consumers, some of whom mistook the blue blotches for some kind of heretofore unknown precipitation (plasma storms? radioactive fallout?). “That was a nuclear meltdown,” Dr. Rose told me. “Somebody wrote in and said, ‘For years you’ve been telling us that rain is green—and now it’s blue? What madness is this?’”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“The instinctual shortcut that we take when we have "too much information" is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“An oft-told joke: a statistician drowned crossing a river that was only three feet deep on average.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
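The joke's point in numbers (a hypothetical depth profile of my own, not from the book): the mean depth of a crossing says nothing about its deepest point.

```python
# Hypothetical depths sampled across a river crossing, in feet.
depths = [1, 2, 2, 3, 3, 9, 2, 2, 1, 2, 2, 3, 8, 2, 3]

mean_depth = sum(depths) / len(depths)
max_depth = max(depths)

print(f"average depth: {mean_depth:.1f} ft")   # sounds safe to wade
print(f"maximum depth: {max_depth} ft")        # what actually drowns you
```

An average is a summary that discards the distribution; for decisions that hinge on extremes, the tails matter more than the mean.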
“The best model of a cat is a cat.”91 Everything else is leaving out some sort of detail. How pertinent that detail might be will depend on exactly what problem we’re trying to solve and on how precise an answer we require.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“how much our relatively narrow range of experience can color our interpretation of the evidence.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“the average, like the family with 1.7 children, is just a statistical abstraction.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“If you talk with the very best players, they don’t take any of their success for granted; they focus as much as they can on self-improvement.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“The first 20 percent often begins with having the right data, the right technology, and the right incentives. You need to have some information—more of it rather than less, ideally—and you need to make sure that it is quality-controlled. You need to have some familiarity with the tools of your trade—having top-shelf technology is nice, but it’s more important that you know how to use what you have. You need to care about accuracy—about getting at the objective truth—rather than about making the most pleasing or convenient prediction, or the one that might get you on television. Then you might progress to a few intermediate steps, developing some rules of thumb (heuristics) that are grounded in experience and common sense and some systematic process to make a forecast rather than doing so on an ad hoc basis.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“Be wary, however, when you come across phrases like “the computer thinks the Yankees will win the World Series.” If these are used as shorthand for a more precise phrase (“the output of the computer program is that the Yankees will win the World Series”), they may be totally benign. With all the information in the world today, it’s certainly helpful to have machines that can make calculations much faster than we can. But if you get the sense that the forecaster means this more literally—that he thinks of the computer as a sentient being, or the model as having a mind of its own—it may be a sign that there isn’t much thinking going on at all. Whatever biases and blind spots the forecaster has are sure to be replicated in his computer program. We have to view technology as what it always has been—a tool for the betterment of the human condition. We should neither worship at the altar of technology nor be frightened by it. Nobody has yet designed, and perhaps no one ever will, a computer that thinks like a human being.49 But computers are themselves a reflection of human progress and human ingenuity: it is not really “artificial” intelligence if a human designed the artifice.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“A long-term study by Philip E. Tetlock of the University of Pennsylvania found that when political scientists claimed that a political outcome had absolutely no chance of occurring, it nevertheless happened about 15 percent of the time.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
“Greed and fear are volatile quantities, however, and the balance can get out of whack. When there is an excess of greed in the system, there is a bubble. When there is an excess of fear, there is a panic.”
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
