
The Signal and the Noise: Why So Many Predictions Fail—But Some Don't

Nate Silver built an innovative system for predicting baseball performance, then predicted the 2008 election within a hair's breadth. He solidified his standing as the nation's foremost political forecaster with his near-perfect prediction of the 2012 election. Silver is the founder and editor-in-chief of FiveThirtyEight.com.

Drawing on his own groundbreaking work, Silver examines the world of prediction, investigating how we can distinguish a true signal from a universe of noisy data. Most predictions fail, often at great cost to society, because most of us have a poor understanding of probability and uncertainty. Both experts and laypeople mistake more confident predictions for more accurate ones. But overconfidence is often the reason for failure. If our appreciation of uncertainty improves, our predictions can get better too. This is the "prediction paradox": The more humility we have about our ability to make predictions, the more successful we can be in planning for the future.

In keeping with his own aim to seek truth from data, Silver visits the most successful forecasters in a range of areas, from hurricanes to baseball, from the poker table to the stock market, from Capitol Hill to the NBA. He explains and evaluates how these forecasters think and what bonds they share. What lies behind their success? Are they good, or just lucky? What patterns have they unraveled? And are their forecasts really right? He explores unanticipated commonalities and exposes unexpected juxtapositions. And sometimes, it is not so much how good a prediction is in an absolute sense that matters but how good it is relative to the competition. In other cases, prediction is still a very rudimentary, and dangerous, science.

Silver observes that the most accurate forecasters tend to have a superior command of probability, and they tend to be both humble and hardworking. They distinguish the predictable from the unpredictable, and they notice a thousand little details that lead them closer to the truth. Because of their appreciation of probability, they can distinguish the signal from the noise.

544 pages, Hardcover

First published September 27, 2012


About the author

Nate Silver

7 books · 477 followers
Nathaniel Read "Nate" Silver (born January 13, 1978) is an American statistician and writer who analyzes baseball and elections. He is currently the editor-in-chief of ESPN's FiveThirtyEight blog and a Special Correspondent for ABC News. Silver first gained public recognition for developing PECOTA, a system for forecasting the performance and career development of Major League Baseball players, which he sold to and then managed for Baseball Prospectus from 2003 to 2009.

In 2007, writing under the pseudonym "Poblano", Silver began to publish analyses and predictions related to the 2008 United States presidential election. At first this work appeared on the political blog Daily Kos, but in March 2008 Silver established his own website, FiveThirtyEight.com. By summer of that year, after he revealed his identity to his readers, he began to appear as an electoral and political analyst in national print, online, and cable news media.

The accuracy of his November 2008 presidential election predictions—he correctly predicted the winner of 49 of the 50 states—won Silver further attention and commendation. The only state he missed was Indiana, which went for Barack Obama by one percentage point. He correctly predicted the winner of all 35 U.S. Senate races that year.

In April 2009, he was named one of The World's 100 Most Influential People by Time.

In 2010, Silver's FiveThirtyEight blog was licensed for publication by The New York Times. The newly renamed blog, FiveThirtyEight: Nate Silver's Political Calculus, first appeared in The Times on August 25, 2010. In 2012 and 2013, FiveThirtyEight won Webby Awards as the "Best Political Blog" from the International Academy of Digital Arts and Sciences.

Silver's book, The Signal and the Noise, was published in September 2012. It subsequently reached The New York Times best-seller list for nonfiction, and was named by Amazon.com as the #1 best nonfiction book of 2012. The Signal and the Noise won the 2013 Phi Beta Kappa Award in Science. The book has been published in eight languages.

In the 2012 United States presidential election between Barack Obama and Mitt Romney, he correctly predicted the winner of all 50 states and the District of Columbia. That same year, Silver's predictions of U.S. Senate races were correct in 31 of 33 states; he predicted Republican victory in North Dakota and Montana, where Democrats won.

In July 2013, it was revealed that Silver and his FiveThirtyEight blog would depart The New York Times and join ESPN. In his new role at ESPN, Silver would become editor-in-chief of the FiveThirtyEight site. ESPN would own the FiveThirtyEight site and the brand. The ESPN-owned FiveThirtyEight launched on March 17, 2014. Silver's lead article explained that the site would focus on a broad range of subjects under the general rubric of "data journalism".

Ratings & Reviews


Community Reviews

5 stars: 15,291 (31%)
4 stars: 20,628 (42%)
3 stars: 9,779 (20%)
2 stars: 2,243 (4%)
1 star: 803 (1%)
Profile Image for Justin.
21 reviews · 9 followers
March 26, 2013
I had read most of this book with a fair degree of equanimity - finding some faults, but also a lot of good information in it. Then I'm jarred out of complacency by a sudden shot from nowhere, in which he says that David Hume, one of the greatest philosophers of the 18th century, is simply too 'daft to understand' probabilistic arguments. Without any introduction to the subject, he claims Hume is stuck in some 'skeptical shell' that prevents him from understanding the simple, elegant solutions of Bayes.

What makes this so painful to read is that it shows Silver has never even taken the time to read Hume, at least not more than the two paragraphs he used to cite his sources. If he had even kept on for five more pages he would have found that Hume was defending the very type of probabilistic arguments that Silver said Hume was 'too daft' to understand. Nate seems to have given a cursory glance to a single page of Hume's work - "SCEPTICAL DOUBTS CONCERNING THE OPERATIONS OF THE UNDERSTANDING," without even bothering to proceed to the very next section - "SCEPTICAL SOLUTION OF THESE DOUBTS," in which Hume lays a rational foundation for belief in the absence of certainty.

In fact, the entire 'Enquiry of Human Understanding' can be read as a treatise attempting to supplant abstract and questionable a priori proofs, with more sensible arguments grounded entirely in the test of experience and probability. By brushing Hume aside so casually, Silver spits in the face of his own philosophical progenitor - a man who helped plant the foundations for the sort of thinking that Silver now takes for granted.

I am sure the vast majority of readers will roll a bemused eye at my anger over trivial details like this - but not only does it show that Silver very often doesn't take the time to understand his sources (see Michael Mann's critique of Silver's presentation of global warming), but Silver's casual remarks could easily turn a lot of readers off to Hume before they've even read him. Trendy books like Silver's are far more popular than classic works of philosophy, and new readers are likely to take Silver's description as an accurate portrayal of that daft, old skeptic, David Hume.
Profile Image for Michael Austin.
Author 113 books · 236 followers
October 28, 2012
Nate Silver has done an incredible (and, quite possibly an unpredictable) thing with _The Signal and the Noise_: He has written an extremely good book when he didn't even have to. Nothing is more common than for someone like Silver--a media phenom with a strong platform (his 538 blog) to phone a book in to cash in on his 15 minutes. I have probably read two dozen books in the past five years that do exactly this. But _The Signal and the Noise_ is a much more substantial book than, say, _The Black Swan_ or either of the _Freakonomics_ offerings. It is a wide-ranging, in-depth look at the ways that we are wired to make predictions (and the reasons that these are so often wrong).

Silver ranges over a variety of prediction environments: baseball, chess, poker, the stock market, politics, weather, and terrorist attacks, to name the most interesting. Throughout it all, he reminds us that human beings are pattern-seeking animals and that we are just as likely to build patterns where none exist as we are to find the correct patterns and harness their predictive capacity. Predictions work best when 1) they are probabilistic (i.e., they express a range of possibilities and assign a probability to each); 2) they use as much information--both statistical and analytical--as possible; and 3) they are continually revised to account for new information.

As logical as these sound, human nature seems to drive us in three opposite directions: 1) we seek predictions that are definite and can be acted upon (i.e. "Obama will beat Romney," or "it will rain tomorrow"); 2) we gravitate towards methodologies that seem to discover a magic bullet formula that guarantees success; and 3) we feel compelled to stand by our predictions even as they become increasingly unlikely. Seasoned prognosticators play a long game. Under the right circumstances (a poker game, for example), a strategy that produces only a slightly better prediction than random chance can produce huge dividends.

Perhaps most surprisingly, Silver is a great writer (or, at least a great explainer). As an English major with very little grounding in statistics, I could still understand everything he said. Even more importantly, his narratives are interesting. Who could have predicted that from America's most famous stat-geek?
Profile Image for Charles.
256 reviews · 40 followers
January 1, 2013
The Signal and the Noise is a very interesting book with mixed success: 3 1/2 stars, were this permitted. I found it somewhat difficult to review; however, my entire book group – without exception – had similar opinions. I would encourage you to view this as a group opinion.

At its best, TSANTN is interesting, illustrative, educational, and provocative. And many chapters – including banking, the weather, volcanoes, elections, and poker – were exactly that. Four stars, without hesitation. The problem is that some chapters – including baseball, terrorists, and the last several – were dull. Either too long or too scattered or just not interesting. (Again, this was the unanimous opinion among my group.)

Nate Silver is a wunderkind polymath, who has scored resounding successes in statistical applications to baseball, poker, and, most recently and most impressively, politics. He emphasizes that huge bunches of data are the tools needed for predictions and that there are huge bunches of data out there. He calmly points out that some things are predictable and are predicted, using various methods with resultant various success. Some things that are predictable are not predicted accurately, exactly because the wrong tools or approaches are used. He equally argues that some things are not predictable, and when predicted, have, predictably, low success. Poor predictors often share the characteristics of ignorance of facts, inappropriate application of basic probability analyses, and, especially, overconfidence. Forecasts are made more inaccurate by overfitting – confusing noise for signal.

His grasp of applied math and statistics is refreshing. His application – although perhaps not his explanation – of Bayes' theorem is lucid. His writing style is casual, more impressive considering the subject material.

As has been noted by others, the number of typographical errors is unacceptable. An even greater editorial error is letting the author ramble on (again, in some chapters). Liberal use of both a sharp red pencil and an X-Acto knife would have improved this book.

So, overall, I really liked some parts. This is why I gave the book a 4-star review. (Most of my book group ended up awarding only 3-stars). But, overall, after a few strong opening innings, the precision of text and purpose waned. In the beginning I did not want the book to end; by 2/3 of the way through, I was more than ready.


Profile Image for David Rubenstein.
801 reviews · 2,521 followers
April 19, 2013
This is a fantastic book about predictions. I enjoyed every page. The book is filled to the brim with diagrams and charts that help get the points across. The book is divided into two parts. The first part is an examination of all the ways that predictions go wrong. The second part is about how applying Bayes Theorem can make predictions go right.

The book focuses on predictions in a wide variety of topics: economics, the stock market, politics, baseball, basketball, weather, climate, earthquakes, chess, epidemics, poker, and terrorism! Each topic is covered lucidly, in sufficient detail, so that the reader gets a good grasp of the problems and issues for predictions.

There are so many fascinating insights, I can only try to convey a few. At the present time, it is impossible to predict earthquakes, that is, to state ahead of time when and where a certain magnitude earthquake will occur. But it is possible to forecast earthquakes in a probabilistic sense, using a power law. Likewise, it may be possible to forecast terrorism, because that too, follows a power law! (Well, it follows a power law in NATO countries, probably because of the efforts to combat terrorists. But in Israel, the tail of the curve falls below the power law, likely because of the stronger anti-terror emphasis there.)
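The "power law" mentioned in this paragraph can be made concrete with the Gutenberg-Richter relation that Silver discusses in his earthquake chapter. The sketch below is illustrative only: the calibration (100 quakes of magnitude 5 or greater per century, with slope b = 1) is an assumption for the example, not a figure from the book.

```python
import math

# Gutenberg-Richter power law: the expected count of earthquakes with
# magnitude >= m is N = 10**(a - b*m). Each one-unit step up in magnitude
# divides the expected frequency by 10**b.
def gutenberg_richter(m, a, b=1.0):
    return 10 ** (a - b * m)

# Calibrate `a` from an assumed observation: 100 quakes of M >= 5 per century.
b = 1.0
a = math.log10(100) + b * 5  # chosen so that N(5) = 100
print(gutenberg_richter(6, a, b))  # prints 10.0 (ten M >= 6 per century)
print(gutenberg_richter(7, a, b))  # prints 1.0  (one M >= 7 per century)
```

This is the sense in which earthquakes can be "forecast but not predicted": the relation says how often quakes of a given size occur on average, not when or where the next one will strike.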

The accuracy of weather predictions increases slowly but steadily, year by year. Ensembles of computer model runs are part of the story, but human judgment adds value and increases the accuracy. Weather forecasts issued by the National Weather Service are unbiased in a probabilistic sense. But weather forecasts by the TV weatherman are very strongly biased--the weatherman over-predicts precipitation by a significant amount.

Nate Silver shows that the people who are most confident are the ones that make the worst predictions. The best predictions are those that are couched in quantitative uncertainties. Silver shows how Bayes Theorem can be applied to improve predictions; it is all about probabilities. And I just love this footnote,
A conspiracy theory might be thought of as the laziest form of signal analysis. As the Harvard professor H.L. "Skip" Gates says, "Conspiracy theories are an irresistible labor-saving device in the face of complexity."

Profile Image for Ted.
515 reviews · 744 followers
July 12, 2017
4 ½ stars.

Nate Silver is probably best known as the statistician who confounded the “experts” by predicting the results of the 2008 and 2012 U.S. Presidential elections. As a matter of fact, his web site (https://fivethirtyeight.com/) actually did much better than the average pollsters and media with the 2016 election as well. I was following the writing on the site right up to the night of the election. Entering the final few days, 538 was giving Trump about a 1/3 chance of winning, while most others were saying that the election was a foregone conclusion. And on election day, the 538 article which pointed out early signs that Hillary could be in trouble was so accurate that I had given up for her before 10 pm that evening.





And, despite any negative impressions I may leave below about any issues I previously had with Silver's writing, or his style, the last few years, in which he's developed his own web site, together with the interactions he's had with the commenters and other statisticians that he's hired, have made his writing a model of clearness and conciseness. He also (nowadays) is very careful to refrain from making rash statements about probabilities, usually listing many reasons why the "odds" being quoted could be risky bets.

Anyway - before Silver's election triumphs he was known to a less wide, but no less fervid, audience as a sabermetrician who, starting in 2003, contributed predicted statistical ranges of performance for major league baseball players to the Baseball Prospectus.

In The Signal and the Noise, Silver discusses issues related to these foundations of his reputation in the second and third chapters. To me, the chapter on political predictions was fascinating, the chapter on baseball less so – this despite, or perhaps because of, the fact that I’ve been a keen consumer of sabermetric literature almost since Bill James brought it into the mainstream in the late 1970s.

On balance I found the book, in terms of insights offered and simple interest, much closer to the political chapter than the baseball chapter – thus the high rating.

I have to confess, however, that I certainly had my expectations lowered by Silver’s Introduction. This impressed me as an attempt (possibly at the urging of an editor?) to present a “Big Theme” context to the book which was described not only disjointedly, but in a manner that makes Silver look like a poor writer, which he isn’t at all.

The “Big Theme” that Silver talks about in the Introduction is that of Big Data inundating humankind, starting with the invention of the printing press and culminating in recent decades in the spread of powerful computers (to both hold and analyze previously unimaginable amounts of data) and the world wide web, which makes this data not merely available to almost anyone, but overwhelmingly so.

But Big Data is only briefly mentioned in the book, and is brought up again in the Conclusion in a correspondingly unenlightening manner. In fact, the book’s first and foremost theme is simply expressed in the book’s title. The difficulty in handling large amounts of data is separating the signal from the noise. The theme, expressed in this manner, is handled more or less brilliantly throughout.

Once past the Introduction, the book immediately improved. Silver seemed to quickly find his comfort level in treating one area after another in which we attempt to make predictions, with varying success. Besides the chapters on political forecasts and baseball, there are discussions of the economic meltdown of 2007-8; weather and earthquake predictions; economic forecasts; infectious disease (flu) forecasts; gambler's bets; top-level chess; poker; investments; climate forecasts; and terrorism.

The great majority of the chapters I found very interesting. Silver writes well, and can clearly get across his points. He shows convincingly I think how these fields differ from one another, and how the problems they have with making successful predictions and forecasts vary from field to field, depending on a variety of elements.

I approached the chapter on climate prediction with some trepidation, wondering if Silver was going to somehow take the position that it was all baloney. Thankfully no, and his conclusions about climate forecasts are along the lines of “well the forecasts of warming so far have had a rather mixed record”. So he feels there is a case to be made for some skepticism regarding the accuracy of the models, and thus of the forecasts being produced by the models. He doesn’t doubt for a moment the science involved, or the ultimate warming path we are on, but cautions against believing that we have a very good handle on how fast the warming will occur under different scenarios of additional heat trapping elements being added to the atmosphere.

But what Silver doesn’t analyze, here or anywhere else in the book, is how the aspect of risk should be accounted for in making predictions, or in acting on the predictions that we do make. I suppose this may be a bit off the track of what he’s addressing in the book. But it’s one thing to forecast the likelihood of my house burning down (very small), or of a young healthy person needing vast amounts of medical care in the next 12 months (also very small). It’s quite another to use those forecasts to conclude that in neither one case nor the other is spending money on insurance a good idea.

Most of us realize that because of the catastrophic consequences of these very unlikely events, buying insurance is rational. In the same way, it seems to me that ignoring climate change forecasts until “more evaluation” of these forecasts, and thus more fine tuning of the models, can be done, is a tremendously risky thing to do, and cannot really be rationally justified.

I’ll wind up with a brief mention of an aspect of Silver’s thinking that I found more interesting than anything else. That is his interest in, and application of, Bayesian reasoning or inference. Silver is quite obviously much taken with this, and he does a good job (in my opinion) of explaining it. He doesn’t really introduce it until his chapter on gambling, where he shows how it can be used to make probabilistic forecasts using several interesting (non-gambling) examples. In almost every chapter following this he refers to the way that Bayesian reasoning can be used to strengthen forecasting and to overcome some of the difficulties of predicting in that area.
Profile Image for Julie .
4,001 reviews · 58.9k followers
March 17, 2017
The Signal and the Noise by Nate Silver is a 2012 Penguin publication.

More Information, more problems-

This book was recommended by one of the many book-related emails I get each day. I can't remember what the particular theme was for its recommendation, although I'm sure it had something to do with how political forecasting data could fail so miserably. Nevertheless, I must have thought it sounded interesting and placed a hold on it at the library.

Many of you may be familiar with statistician Nate Silver. His blog/podcast, 'FiveThirtyEight', is quite popular, featuring talks about polls, forecasting, data, and predictions about sports and politics, and was even carried by the NYT at one point. I admit I was not familiar with his work until now. However, after reading this book, I think I will keep a closer eye on his website.
This book examines the way data is analyzed, how some predictions are correct and why some fail.

“The Signal is the truth. The noise is what distracts us from the truth.”

I’m not one to put my trust in predictions or polls. I don’t bet on sports teams, and I’m even skeptical about the weather forecast. With the polls and the media thinking they had the most recent election forecasted, I think people are warier than ever. That may be why there has been a renewed interest in this book.

The first section of the book takes a look at the various ways experts make predictions, and how they could miss something like the financial crisis, for example.

Silver does speak to political predictions: thinking like the fox versus the hedgehog, the bias of political polls, the media's obsession with things the public doesn't care about. Remember, this book was published in 2012, so apparently the media didn't learn their lesson. (Silver predicted Obama's win over Romney, much to the chagrin of 'Morning Joe', and predicted the outcome of the most recent election more accurately than most.)

“The fox knows many little things, but the hedgehog knows one big thing”.

The second portion of the book is where Silver really excels: baseball statistics.
Now, this section really appeals to baseball fans, which I am not. But it would also appeal to those who understand math and complicated algorithms. Again, not my thing. I tried my best to understand this section, but just could not get into it, and because it was not a topic I was well versed in, much of it went over my head and, frankly, it was boring to me. So I gave up on this section and went to the next.

Weather:
This section, which deals with the prediction of major weather events such as hurricanes, was very interesting. Weather forecasting has an effect not only on safety, but on our economy as well. Many times, forecasters get things right and many lives are saved; at other times they get it right, but things are not as bad as predicted, such as the recent blizzard expected to hit NYC. Yet, as frustrating as that may be, erring on the side of caution still might be a good thing, and remember, many weather forecasters, those working behind the scenes, are not being paid exorbitant fees. Just think about the times when you made it out of the path of a tornado, and be thankful for these guys, who must decipher an incredible amount of data and unpredictable patterns, and who must deal with the human element on top of that. Raw data doesn't always translate well to the average consumer. For example: what does 'bitter cold' mean to you? But there has to be an honesty in forecasting, too. Television ratings can come into play, unfortunately. This was my favorite section of the book.

Earthquake predictions, economic forecasters, sports betting/gamblers: anyone or anything that depends on statistics, data, or formulas is examined in this book. It's all interesting, for the most part, although math equations and other information laid out went over my head. The author recommends Bayes' theorem, which I understood on one level, but was overwhelmed by most of the time.

But I did find the book fascinating, informative, and chock-full of calculations juxtaposed against unpredictable elements that could not be foreseen, or against patterns in plain sight that were ignored, all mixing together to prove why predictions and forecasts often fail, but also what makes them work!

Although this book centers on events taking place throughout the economic crisis, a point the author often refers back to, the last point in the book, 'what you don't know can hurt you', reminds us that history can repeat itself, that there is always the element of improbability, the unfamiliar, the unknown, and what we can learn from it in order to make better, more informed decisions in the future.
4 stars
1 review
October 2, 2012
This book was a disappointment for me, and I feel that the time I spent reading it has been mostly wasted. I will first, however, describe what I thought was good about the book. Everything in this book is very clear and understandable. As for the content, I think that the idea of Bayesian thinking is interesting and sound. The idea is that, whenever turning any hypothesis (e.g. that a positive mammogram is indicative of breast cancer) into a prediction (for example, that a particular woman with a positive mammogram actually has cancer), one must not forget to estimate all three of the following pieces of information:

1. The general prevalence of breast cancer in the population. (This is often called the "prior": how likely you thought it was that the woman had cancer before you saw the mammogram.)

2. The chance of getting a positive mammogram for a woman with cancer.

3. The chance of getting a positive mammogram for a woman without cancer.

People often tend to ignore items 1 and 3 on the list, leading to very erroneous conclusions. "Bayes' rule" is simply a mathematical gadget to combine these three pieces of information and output the prediction (the chance that the particular woman with a positive mammogram has cancer). There is a very detailed explanation of this online, no worse (if more technical) than the one in the book. If you'd like a less technical description, read chapter 8 of the book (but ignore the rest of it).
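The three pieces of information above plug directly into Bayes' rule. Here is a minimal sketch; the numbers (1.4% prevalence, 75% true-positive rate, 10% false-positive rate) are illustrative assumptions, not figures quoted in the review:

```python
# Bayes' rule combining the three estimates described above.
def posterior(prior, p_pos_given_cancer, p_pos_given_healthy):
    """P(cancer | positive mammogram)."""
    numerator = prior * p_pos_given_cancer                        # items 1 and 2
    denominator = numerator + (1 - prior) * p_pos_given_healthy   # ... plus item 3
    return numerator / denominator

# Assumed inputs: 1.4% prevalence, 75% true-positive, 10% false-positive.
print(round(posterior(0.014, 0.75, 0.10), 3))  # prints 0.096
```

Note how counterintuitive the result is: with these inputs, a woman with a positive mammogram still has less than a 10% chance of actually having cancer, precisely because items 1 and 3 (the low prior and the false-positive rate) dominate.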



Now for the bad. While the Bayesian idea is valuable, its description would fit in a dozen pages, and it is certainly insufficient by itself to make good predictions about the real world. I had hoped that the book would draw on the author's experience and give an insight into how to apply this idea in the real world. It does the former, but not the latter. There are lots of examples and stories (sometimes amusing; I liked the chess story in Chapter 9), but the stories lead the reader to few insights.

The examples only lead to one conclusion clearly. If you need to be convinced that "the art of making predictions is important, but it is easy to get wrong", read this book. If you wonder "how can we actually make good predictions?", don't. The only answers provided are useless platitudes: for example, "it would be foolish to ignore the commonly accepted opinion of the community, but one must also be careful to not get carried away by herd mentality". While I was searching for the words to describe the book, I found the perfect description in Chapter 12 of the book itself:


Heuristics like Occam's razor ... sound sexy, but they are hard to apply.... An admonition like "The more complex you make the model the worse the forecast gets" is equivalent to saying "Never add too much salt to the recipe".... If you want to get good at forecasting, you'll need to immerse yourself in the craft and trust your own taste-buds.


Had this quote been from the introduction, and had the book given any insight into how to get beyond the platitudes, it would be the book I hoped to read. However, the quote is from the penultimate chapter, and there is no further insight inside this book.
Profile Image for Kate.
392 reviews · 58 followers
July 17, 2014
I'm going to do this the Nate Silver (Bayesian) way. Kind of.

Prior Probability
Initial estimate of how likely it is that I will buy Nate Silver a drink: x = 10% (This may seem high, given that he is a stranger who lives in another city, but I did rely on his blog during the past two elections, so I'd at least like to.)
New Event -- I read Nate Silver's book
Probability that I will fly to New York and track him down and thrust a drink in his hand because this was a great book and I am impressed. y = 50%
Probability that I will stay home and just remember to check FiveThirtyEight more often instead. z = 30%
Posterior Probability
Revised estimate of probability that I will buy Nate Silver a drink, given that his book was illuminating and enjoyable: xy / (xy + z(1-x)) = 15.6%.

Feel free to check my math.
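Taking the reviewer up on the offer: a quick check, assuming the formula intended is the simplified Bayes form used in the book, posterior = xy / (xy + z(1 - x)).

```python
# Checking the review's arithmetic with the simplified Bayes form.
x, y, z = 0.10, 0.50, 0.30   # prior, P(read book | will buy drink), P(read book | won't)
posterior = (x * y) / (x * y + z * (1 - x))
print(f"{posterior:.1%}")    # prints 15.6%
```

The math checks out: 0.05 / (0.05 + 0.27) = 0.15625, which rounds to 15.6%.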
116 reviews
December 31, 2015
I wanted to like this book, as I enjoy reading Silver's blog. The majority of chapters in this book are inferior rehashes of arguments and anecdotes from other authors. See Moneyball, The Information, Fortune's Formula, A Random Walk, The Theory of Poker, etc. The book is clearly intended to capitalize on the popularity of his 538 blog, which, as John Cassidy of the New Yorker just articulated, overemphasizes the use of Monte Carlo simulations to come up with inanely precise projections, to a tenth of a point, of who will win the presidential election. While heuristics and Monte Carlo-style simulations may provide detail given the parameters included in the model, Silver's assumptions about the usefulness of one poll over another and the averaging of prediction markets generally reach similar conclusions to what basic common sense would dictate. I happen to believe that, just as some people inevitably beat the market by looking at past historical data without actual acumen, Silver's model seems to have been successful.
The self-aggrandizing by Silver of his own skill at poker, political forecasting, sports betting, etc., seems to belie his own understanding of Bayesian theory and at times reaches nauseating levels. I don't care to know his personal income from limit poker or his player-tracking system used by Baseball Prospectus. The book dabbles in many areas and is truly compelling in none of them. While not an awful book, a curious reader would be better served by reading separate books on areas of interest, including books that offer a stronger statistical background and fewer "pop culture" examples.

I do not recommend this book to anyone.
See more @

Timeisrhythm.wix.com/home
Amin
333 reviews · 321 followers
June 5, 2018
(Persian below)

My actual rating would be 7/10. In general, it was an interesting and insightful read, although I have mixed feelings about some of the chapters and concepts, and sometimes about the pretentious tone in which ideas are presented. Let's start with two weaknesses:

At some points good prediction seems like a 'hammer' that sees every problem as a 'nail': all problems can be interpreted as failures of prediction. To me this does not sound very scientific (in a Popperian sense). An 'out-of-sample' situation for Silver is close to what Taleb uses to explain 'antifragility'. Likewise, the concepts of hedgehogs and foxes are interesting, but the implications are black and white, in a gray world.

Furthermore, there is too much detail and filler on some topics, such as American baseball and basketball players, which makes the book boring, or too Americanized, and obscures the main points, leaving some chapters feeling very journalistic. Still, it tries to highlight the importance of statistics, and the way facts that are less quantifiable and less accessible to everyone contribute to unique predictions.

The second, more analytical half of the book was more interesting to me. Among the ideas worth further investigation across its chapters:

* the changing mental model of prediction (the advantages of humans over computers, and the role of chaos theory);
* statistical errors in everyday life (overfitting, and taking noise for signal);
* the necessity of creating incentives for good predictions (especially in politics and economics);
* the complexity of predictions for resolving collective and global problems;
* the necessity of understanding context (while still having good theories);
* the story of human/machine rivalry (materialized in the Kasparov/Deep Blue match);
* the necessity of familiarity with the basic principles of human action (to live in an uncertain world);
* the misuse of prediction in financial markets;
* the journalistic side effects of making prediction a popular science;
* the necessity of understanding the world as a gray zone (in contrast to black and white, or impossible/certain situations).

[Translated from Persian:] Overall, it was a useful and readable work, although the chapters differed considerably in their scientific and practical detail. The second, more analytical half of the book was more appealing, in that it presented important and applicable concepts: human statistical errors in calculation, the difference (or rivalry) between humans and computers in prediction, the need for a basic familiarity with the science of prediction in everyday life, the importance of attending to the context of each subject for sound prediction, and so on.

But two criticisms. First, in the style of bestselling popular-science books, such as those of Gladwell and Nassim Taleb, the book's central concept of sound prediction is like a hammer that sees everything as a nail, with sound prediction put forward as the main solution. From a Popperian viewpoint I do not consider this approach very scientific; to me it has more of a commercial flavor. Second, the abundant and perhaps unnecessary detail in some chapters gives the book an American cast (for example in the chapters on baseball or basketball), or makes it appealing chiefly to a reader with a strong interest in that chapter's particular topic.

Details on some concepts and chapters:

116 reviews · 39 followers
March 4, 2018
Another classic on statistics. This one focuses more on real-life applications: sports, politics, finance, weather, climate change... I assume those who have had basic statistics would enjoy it more. It is about weeding the noise out of the data and zooming in on the signals, which improves the quality of the predictions. All easier said (or read) than done :)
Here is my prediction... okay, more like a hunch: machines won't be taking over the sorting task mentioned above before humans safely land on Mars. Let's see how I did.
This was my second read of the book as part of my recent series of refreshers on statistics and data analysis. I felt I appreciated Silver's approach to the problems more this time, hence I added one star.
Mike Mueller
19 reviews · 45 followers
April 9, 2021
I followed Nate Silver's blog (FiveThirtyEight) closely during the run-up to election day 2012. His premise was simple: grab every public poll possible, attempt to correct for pollsters' known biases, and produce a forecast based on the result. Somehow no one had thought to do this before. Silver simply crunched the numbers and nailed the outcomes in every state. Meanwhile, pundits, bloggers, and assorted blowhards made predictions based on nothing but gut feeling and partisan hackery, and they mostly missed the mark (often by a wide margin).

I was looking forward to reading more about his methodology in this book, as well as his take on the principles involved in making predictions from noisy data. In this regard, I wasn't disappointed. Silver does a good job of laying out the rules of the road:

* It's easy to mistake essentially random fluctuations for a meaningful pattern, and in some contexts (say, earthquake predictions), this can have devastating results.
* Having a well-formed, testable theory is better than just looking for any correlations you can find in your data set.
* Always make predictions and update your probability estimates like a good Bayesian. Your predictions should approach reality as you continually refine them.
* Watch out for biases in yourself and in your data set.
* Often overlooked: make sure incentives are aligned with the results you would like to achieve.

Also, some specific interesting facts:

* Making a living at poker is really hard. Without any really bad players at the table, it's nearly impossible for anyone but the top players to turn a profit.
* The efficient market hypothesis doesn't hold up to scrutiny; however, even though the stock market has discernible patterns, it may not be possible to exploit the patterns and consistently beat the market.
* Weather prediction has gotten a lot better in the last couple decades, even though most people think it hasn't.
* Both earthquakes and terrorist attacks follow a power law distribution.

If you're a stock trader, scientist, gambler, or simply someone who wants to form an accurate picture in a noisy environment, there's something in this book for you. The book is also well cited, which helps give weight to some of the more counterintuitive claims.

There was a missed opportunity to spend some time on results from the medical research industry. It's well known that publication bias and other factors result in misleadingly positive results for new treatments, which ultimately go away after independent researchers attempt (unsuccessfully) to reproduce the results. It seems like a pertinent, prototypical case of finding patterns in noise, one which could have been instructive.

A final note: Silver is not the best writer; his prose is uneven and occasionally downright awkward. His casual style works fine for a blog, but here it diminishes the impact the book could otherwise have had. This is his first published book, and it shows. There are also a couple glaring mistakes that make me think he needed a better editor.
Mehrsa
2,234 reviews · 3,659 followers
March 28, 2019
Some interesting parts, but it's really hard to take this superforecaster seriously on political forecasting--you know what I mean? And I am sort of over the moneyball theory too. I mean, it was useful a few years ago to break free from "gut feelings", but I think the pendulum swung too far into just cold data and needs to swing back into the world of humans and fat tails and Trump getting elected.
Laura Noggle
671 reviews · 384 followers
July 7, 2020
Meh, I was hoping for more.

Interesting at points, but the main message gets swallowed by the noise—almost too much random content.

Basically, it's hard to predict stuff. Be careful which predictions you trust; most of them will be wrong a good portion of the time.

The end.
Jonathan Mckay
578 reviews · 53 followers
June 14, 2016
The Prior
Before reading this book, I thought there was a 70% chance I would rate this book 3 stars or higher.

The Signal
Silver's chapter on poker was interesting both from the perspective of statistics and for its poker tactics and metagame. I wish this were the core of the book. Also, the explanation of Bayes' theorem was solid, as was the chapter on stocks.

The Noise
Everything else. Superforecasting is MUCH better when talking about predictions, and much more engaging. Shiller's book Irrational Exuberance is better on stocks, even Rumsfeld's biography Known and Unknown: A Memoir is better when talking about politics. It felt like Silver took a lot of shortcuts and made claims about causality in multiple areas without sufficient evidence.

The Result
Read chapters 8, 10, and 11. Skip the rest. Better yet, just skip this book and read Superforecasting.


That's 77% of the chapters that are below three stars for me. So let's run some Bayesian inference, with the hypothesis that I would give this book >= 3 stars.

P(Hypothesis | Evidence) = P(Evidence | Hypothesis) × P(Hypothesis) / P(Evidence)

0.27 = 0.3 × 0.7 / 0.77

Now there is only a 27% chance of >= 3 stars.
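The reviewer's arithmetic holds up; a minimal sketch (variable names mine, not the reviewer's) of the same computation:

```python
# Bayes' rule with the reviewer's numbers:
#   P(H)   = 0.7   prior that the book deserves >= 3 stars
#   P(E|H) = 0.3   chance of the observed evidence if it does
#   P(E)   = 0.77  the 77% of chapters rated below three stars
p_h, p_e_given_h, p_e = 0.7, 0.3, 0.77

posterior = p_e_given_h * p_h / p_e
print(round(posterior, 2))  # 0.27
```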
Rick Presley
461 reviews · 12 followers
January 3, 2018
Nate Silver does an excellent job demonstrating the different domains where statistics plays a part. More importantly, he describes why methods that proved successful in one domain are inadequate or inappropriate to another domain. The best part about the book is that he doesn't resort to math to explain these differences.

The problem with the book is that he fails to take the lessons from previous chapters and apply them to subsequent chapters. I think this may explain his hubris in mis-forecasting the 2016 election outcome. I did hear an interview in which he said his stats weren't wrong: if 2 out of 3 scenarios had Hillary winning, then 1 out of 3 scenarios had Trump winning. I think this illustrates his own discussion of the difference between likelihood and probability.

I would recommend this as a primer on stats for the non-mathematician, but I would caution that there are sprawling passages of boring stuff that you'll want to skip over.
Cameron
278 reviews · 9 followers
July 29, 2013
This is a really amazing book - a must read for anyone who makes decisions or judgement calls. Even before I had finished the book it caused me to look at some of the assumptions and bad forecasts I was making as well as recognising "patterns" as noise.

There is nothing "new" in this book, just well established and solid methods applied well and explained very coherently. The writing is excellent, the graphics helpful and the type not too small. There are plenty of footnotes (relevant to the page), but I didn't bother with the references at the back. All up it was not at all the onerous read I was expecting from the size and nature of the book.

What I particularly liked was that it agrees with many of my "hunches" and "gut feels" (which seem to work out mostly) but, more importantly, puts forward theory that I can put to the test and use more widely.

A few points raised really made me feel chuffed and not alone (a little cleverer than most): The misuse and misapplication of Occam's razor; Overfit of models onto data; Fisherian statistical significance (particularly in medical science).

There was only one "low" point: chapter 11 on free markets, "If you can't beat 'em...", kind of got off course. It started out as a slightly irked, though legitimate, response to a smart-ass comment about a free-market betting pool being a better predictor than his 538 website. It then went into stock market trading but didn't go far enough into the information inequalities of market making for my liking. The end conclusion (two streams: indexed investment on the signal and short trading on the noise) I agree with.


A final point on my bad predictions: of the last 4 books I have read, I have judged reading time and effort by size and been wrong 3 times - twice with small novels that were philosophically challenging and unpleasant to read, and once with this behemoth of a book that was a breeze to read!
Mal Warwick
Author of 28 books · 394 followers
April 6, 2017
An eminently readable book about how experts make sense of the world (or, more often, don’t)



Statisticians rarely become superstars, but Nate Silver is getting close. This is the guy who writes the FiveThirtyEight.com blog for the New York Times and has correctly predicted the outcome of the last two presidential elections in virtually every one of the 50 states. But Silver is no political maven weaned on election trivia at his parents’ dinner table: he earned his stripes as a prognosticator supporting himself on Internet poker and going Billy Beane of the Oakland A’s (Moneyball) one better by developing an even more sophisticated statistical analysis of what it takes to win major league baseball games. And, by the way: Silver is just 34 years old as I write this post.

The Signal and the Noise is Silver’s first book, and what a book it is! As you might expect from this gifted enfant terrible, the book is as ambitious as it is digestible. Written in an easy, conversational style, The Signal and the Noise explores the ins and outs of predicting outcomes not just in politics, poker, and sports (baseball and basketball) as well as the stock market, the economy, and the 2008 financial meltdown, weather forecasting, earthquakes, epidemic disease, chess, climate change, and terrorism.

Fundamentally, The Signal and the Noise is about the information glut we’re all drowning in now and how an educated person can make a little more sense out of it. As Silver notes, “The instinctual shortcut we take when we have ‘too much information’ is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest.” What else could explain why Mitt Romney was “shell-shocked” and Karl Rove was astonished by Romney’s loss in a presidential election that every dispassionate observer knew was going Obama’s way?

Silver asserts that “our predictions may be more prone to failure in the era of Big Data. As there is an exponential increase in the amount of available information, there is likewise an exponential increase in the number of hypotheses to investigate . . . But the number of meaningful relationships in the data . . . is orders of magnitude smaller. Nor is it likely to be increasing at nearly so fast a rate as the information itself; there isn’t any more truth in the world than there was before the Internet or the printing press. Most of the data is just noise, as most of the universe is filled with empty space.”

Sadly, it’s not just in politics that bias clouds judgment and leads to erroneous conclusions. “In 2005, an Athens-raised medical researcher named John P. Ioannidis published a controversial paper titled ‘Why Most Published Research Findings Are False.’ The paper studied positive findings documented in peer-reviewed journals: descriptions of successful predictions of medical hypotheses carried out in laboratory experiments. It concluded that most of these findings were likely to fail when applied in the real world. Bayer Laboratories recently confirmed Ioannidis’s hypothesis. They could not replicate about two-thirds of the positive findings claimed in medical journals when they attempted the experiments themselves.”

In general, Silver’s thesis runs, “We need to stop, and admit it: we have a prediction problem. We love to predict things — and we aren’t very good at it. . . We focus on those signals that tell a story about the world as we would like it to be, not how it really is. We ignore the risks that are hardest to measure, even when they pose the greatest threats to our well-being. We make approximations and assumptions about the world that are much cruder than we realize. We abhor uncertainty, even when it is an irreducible part of the problem we are trying to solve.”

There’s more: Silver relates the work of a UC Berkeley psychology and political science professor named Philip Tetlock, who categorizes experts as either foxes or hedgehogs (in deference to an ancient Greek poet who wrote, “The fox knows many little things, but the hedgehog knows one big thing.”). Hedgehogs traffic in Big Ideas and often hew to ideologies; these are the people who talk to the press and are frequently found on TV talk shows. Foxes are cautious types who carefully examine and weigh details before reaching conclusions. Not surprisingly, Tetlock found that “The more interviews that an expert had done with the press . . . the worse his predictions tended to be.”

In other words: be afraid. Be very afraid. If the people who supposedly know what they're talking about often really don't, how can the rest of us figure out what's going on?
Lightreads
641 reviews · 523 followers
December 15, 2012
Eh, underwhelmed. A survey of prediction and predictive tools, starting with failures and moving on to successes. Nothing particularly new or interesting here, and I think Silver knew it. It’s not like the premise that the strength of a prediction depends on the accuracy of the data is revelatory or anything. A lot of survey nonfiction like this can be saved with interesting collateral content. This book tours over a dozen topics, but I didn’t find much new or compelling or even particularly complex in the subjects I know something about (the efficient market hypothesis, political polling, the spread of infectious disease), and more damningly I was never engaged by his writing on subjects I don’t know much about (the weather, sports betting, baseball. Oh my God, so much baseball.)

I guess what I’m saying here is that the book format reveals all of Silver’s weaknesses as a writer, and there are many. The nicest thing you can say is that when he’s really on a roll, he’s workmanlike. And that’s okay! He doesn’t have to write brilliantly, he can just keep doing statistical modeling. (Better him than me – I disliked stats so much, it doesn't actually qualify as math in my head.) Just, turns out I prefer him doing stats in 1000 word articles and in person, where he comes across much better.
Atila Iamarino
411 reviews · 4,363 followers
March 10, 2013
[Translated from Portuguese:] I quite liked it. A good explanation of probabilities, predictions, and statistics. Very Bayesian, and well described.
Jeanette
3,216 reviews · 551 followers
August 11, 2022
Ok read. Some of the examples were 4 stars. Especially the baseball and medicine ones.

Stats can be used to prove or disprove almost anything about PAST occurrences or future ones. So I do not think this book got to the depth of what you can do, or how you can mislead, by emphasizing certain stats or revealing only partial graphics. Much reporting also works from small-scale samples of much larger, more dependable real-world factors.

Research itself is always immensely colored in outcomes and proofs by the factors who pay for its existence.

So overall, I don't think this began to cover how wrong prediction, forecast, outcomes can be.
Brian Clegg
Author of 185 books · 2,513 followers
September 29, 2012
It was really interesting coming to this book soon after reading The Black Swan, as in some ways they cover similar ground – but take a very different approach. I ought to say straight away that this book is too long at a wrist-busting 534 pages, but on the whole it is much better than its rival. Where Black Swan is written in a highly self-indulgent fashion, telling us far too much about the author and really only containing one significant piece of information, Signal and Noise has much more content. (Strangely, the biggest omission is properly covering Taleb’s black swan concept.)

What we’re dealing with is a book about forecasting, randomness, probability and chance. You will find plenty about all the interesting stuff – weather forecasting, the stock market, climate change, political forecasts and more, and with the exception of one chapter which I will come back to in a moment it is very readable and well-written (though inevitably takes a long time to get through). It has one of the best explanations of Bayes’ theorem I’ve ever seen in a popular science book, and (properly to my mind) makes significant use of Bayesian statistics.

What’s not to like? Well, frankly, if you aren’t American, you might find it more than a trifle parochial. There is a huge section on baseball and predicting baseball results that is unlikely to mean anything to the vast majority of the world’s readers. I’m afraid I had to skip chunks of that. And there’s a bizarre chapter about terrorism. I have two problems with this. One is the fawning approach to Donald Rumsfeld. Nate Silver seems so thrilled Rumsfeld gives him an interview that he treats his every word as sheer gold. Unfortunately, he seems to miss that for much of the world, Rumsfeld is hardly highly regarded (that parochialism again).

There is also a moment where Silver falls for one of the traps he points out that it’s easy to succumb to in analyzing data. On one subject he cherry picks information to present the picture he wants. He contrasts the distribution of deaths in terrorist attacks in the US and Israel, pointing out that where the US numbers follow a rough power law, deaths in Israel tail off before 100 people killed in an incident, which he puts down to their approach to security. What he fails to point out is that this is also true of pretty well every European country, none of which have Israeli-style security.

I also couldn’t help point out one of the funniest typos I have ever seen. He quotes physicist Richard Rood as saying ‘At NASA, I finally realised that the definition of rocket science is using relatively simple psychics to solve complex problems.’ Love it Bring on the simple psychics.

Overall, despite a few issues it was a good read with a lot of meat on probability and forecasting and a good introduction to the basics of Bayesian statistics thrown in. Recommended.

Review first published on www.popularscience.co.uk and reproduced with permission
Gumble's Yard - Golden Reviewer
1,747 reviews · 1,200 followers
January 31, 2019
A book about prediction by the author of the 538 political blog, which became particularly famous in the 2012 presidential election (after the book was written) due to the author's high confidence, based on polling evidence in marginal states, in an Obama victory. Prior to 538 the author split his time between two jobs: online poker (until it was made illegal in the US - see below) and baseball stat evaluation (where he developed his own site, which he sold to a professional outfit he then worked for).

The book's central themes are the importance of Bayesian stats (as opposed to Fisher-type confidence intervals based only on data) as the optimal blend of expertise and data, and the difficulty of distinguishing the true signal from underlying noise, which can either obscure the signal or create false ones. He covers various areas in turn - all with their own, often very different, forecasting issues - leading to his third point: the difficulty of drawing hard and fast rules around prediction. Generally an interesting book – more a compendium of ideas, and so lacking the really big idea/takeaway – which seems deliberate given that last point.

In respect of the financial crisis, he identifies various failures of prediction (the housing bubble, the rating agencies, failure to see how it would cause a global financial crisis, failure to realise how big and deep the recession would be), which he largely ascribes to over-confidence and inability to forecast out-of-sample events. In political forecasting he claims his ability to think probabilistically, revisit and alter past forecasts, and look for data consensus means he outperforms what is a poor level of competition (biased and unscientific political pundits). For baseball, again he initially competed against simple rules of thumb, but sees the real skill in continuing to combine the best of stats with properly incorporated qualitative information to keep looking for edges. Weather forecasting he sees as largely a success story, especially when you account for bias (for example, over-predicting bad weather, as that is the less catastrophic error) and allow for chaos theory, which makes precise long-range forecasts difficult. Earthquake forecasting by contrast has had almost no success (here he talks about overfitting). For economic forecasting there are lots of challenges (uncertainty-principle-type ideas such as Goodhart's law, self-fulfilling prophecies so that talk of a recession causes one, natural biases of commentators who either don't want to stray from the herd or want to be deliberately provocative), not least the sheer noisiness of economic data. For infectious diseases he discusses self-cancelling prophecies (epidemic warnings change behaviour in a good way), and although it's a challenging area he believes practitioners in this field (perhaps due to their Hippocratic oaths) are more thoughtful about their predictions.
In chess he discusses in detail the psychology of Kasparov's defeat by a computer – an error it made in a losing position convinced him it could think more deeply than it could – as well as where humans are better or worse than computers and how blended programmes are very strong. For poker he takes the view that poker players are natural Bayesians, adjusting their knowledge both as cards appear and by assessing the chance of different hands through an intuitive posterior analysis based on how they think their opponents would act with those hands. He also takes the view that the standard of your opponents is key to whether you can make money. For stock picking he discusses the efficient market hypothesis (especially with transaction costs) and the psychology of bubbles. For climate change he discusses healthy scepticism and his conclusion that scientists are much more seekers after truth than politicians. For terrorist attacks he discusses power laws to extrapolate to major attacks (which actually dominate costs and deaths) and the importance of lateral and imaginative thinking about threats.
Gina
536 reviews · 24 followers
May 20, 2013
Reading Nate Silver is like exhaling after holding your breath for a really long time. I found FiveThirtyEight back in the primary days of 2008, when it was Hillary and Barack fighting it out, and it became apparent that not one of Hillary's advisers, to whom she was presumably paying lots and lots of money, was as smart or observant as Nate Silver (or Obama's advisers). One of my favorite tweets ever (I don't read many tweets) came from Ken Jennings on election morning of 2012, something along the lines of "Obama could still lose this thing if too many Democrats write in Nate Silver with little hearts drawn around his name." He had Obama with a 90% chance of winning. And while you could find plenty of other people calling it for Romney or Obama, they are for the most part just talking heads who don't actually care about reality. When Nate Silver gives you a 90% chance of something, it means that nine times out of ten it is going to happen, and one time out of ten it won't, nothing more and nothing less. You don't have to spend energy paying attention to which station he is on and whom he is catering to. He caters to reality, which is surprisingly novel. Finding someone who can do this feels like, as I said, exhaling.

Of course he has biases, etc., but his job is to be aware of them. This whole book is about why making accurate predictions is extraordinarily difficult. Sometimes it is apparently impossible, as in the cases of trying to beat the stock market over the long term or predict earthquakes. Sometimes it is made extremely difficult by humans' strong tendency not to accept the truth of things that don't serve our ends, as in the case of the financial collapse of 2008 (the first chapter of this book is the absolute best summary of that whole fiasco I have ever read). Sometimes the message of people willing and able to make careful, thoughtful predictions with honest margins of error, as is the case with many climate scientists in relation to global warming, is hijacked by politics and agendas. (The chapter on climate change was also exceptionally good, and the people who criticize Silver for being a climate change denier or for giving legitimacy to deniers' views have very poor reading comprehension and/or are so blinded by their own religious belief in their version of climate change that they cannot accept the reality of how hard it is to make accurate predictions.) The chapter on his era as a successful online poker player was very entertaining and reinforced why I do not have the stomach to be a gambler.

All that being said, be forewarned that most people will find this book extremely boring. It is in the vein of Malcolm Gladwell, but about three times as long and dense (and therefore more substantial). Also, I sadly did not feel like I had gained a very deep understanding of Bayesian thinking by the end, which is unfortunate since that is one of the main points of the book. Surely that is partly my fault, but he could have been more clear about it.

At any rate, I think the chapters on the financial collapse and global warming should be required reading for everyone, and the rest of it for those who are interested.
Daisy
180 reviews · 60 followers
June 12, 2021
I always found probability one of the most interesting branches of maths and so I found this book interesting as it is essentially about statistics and probability.
We live in a world of data, data that is easily collected and easily computed by supercomputers that can reel off millions of calculations a second, but in my experience there are few people who know how to interpret the data and therefore make good use of it. A very small example was a headteacher who was preoccupied with all the teachers keeping very detailed data on each child, down to specifics such as 'can use a semicolon in their writing'. She ran reports on attainment, trends, etc., and when the Year 6 class did not perform as well as she'd hoped, she dug out the Y6 teaching team. What the team pointed out to her was that the data showed every year had made a good rate of progress except Year 3, where attainment took a sharp decline; every year after that attainment increased but never recovered from that dip. Having all the data in the world is no help if you just run with what your instinctive belief tells you. I feel the current covid response is the same: we are told that all decisions are based on the data, but just a superficial look at the data tells you that it is not entirely the data that is informing the rules.
I cite these examples because the thrust of Silver's book is that there needs to be a symbiosis between the data and human interpretation of it. Reassuringly, Silver states that despite IBM's huge weather supercomputer, human input in the forecasting process still improves accuracy by 25% (the percentage it has always improved accuracy by, regardless of the computer's power), and that talent scouts are better predictors of baseball talent than a statistics-based program.
One of the observations he makes is obvious to anyone who has ever entered the mud fight that is twitter. He says that the more information available to people the more entrenched they become in their belief and the less willing to consider other points of view. We enjoy being in an echo chamber with circulating facts that bolster our initial 'gut' belief. In some way we are all becoming hedgehogs; mining a deep vein of specific opinions and views rather than the fox who roams picking up bits here and there from a variety of sources. Foxes are more successful at predicting but the hedgehogs, because of their certainty, get more airtime.
In sum, an interesting book that looks at society as a kind of Pygmalion: we created something which we are now in awe of and treat as a god. Silver tells us it is time to up our game in the data stakes and do what we are good at; then we may become better predictors than we thought possible.
Profile Image for Dave.
414 reviews10 followers
January 20, 2013
Silver's gone 99 for 100 in predicting the state winners of the last two presidential elections. Here he goes something like 7 for 13: very good in parts, solid in some, and misfiring in others. It's well-researched and mostly objective (though by no means totally), but it rarely covers anything I didn't already know. If you've read Michael Lewis's The Big Short and Moneyball you can skip chapters 1 and 3, and if you've ever had a class that proves pundits are no more accurate forecasters than the population at large you can skip chapter 2. In addition, Silver loses his way with the climate change chapter as subjectivity overcomes math, and the piece covering his online poker career is lifeless, as I expect it would be for anyone who's not a fan of the game.

Silver's at his best covering the weather (temperature predictions and hurricane landfall predictions have significantly reduced their margins of error over the last few decades; trust the National Weather Service and not your local newscaster for the most accurate forecast), earthquakes (impossible to predict), and Bayes's theorem, which he champions as the best model by which to live your life and conduct your business.

As we learn that it's nearly impossible to beat the stock market over the long run without the benefit of inside information, it becomes clear that the best thing a reader with sound statistical analysis ability can take away from this book, other than making Bayes's theorem a default operating method, is to take that skill and apply it where the analysis to this point is weak. The stock market, baseball, poker: they've been covered. But if you can separate the signal from the noise as the availability of big data overwhelms our ability to parse the useful pieces from it, then you can gain a competitive edge in your industry. It's good advice, and there are some solid parts of the book, but for such a successful guy there was not much groundbreaking material here. If I weren't a completist I would have read only the chapters that started going somewhere in the first few pages, as the correlation between a chapter's first five pages and the rest of it was about .92. The exception is the chapter on chess, which was fast out of the gate but faded down the stretch, especially as Silver ignored the fact that Kasparov's loss to Deep Blue was in part triggered by the unfairness of the latter's team getting to see the former's recent matches, but not the other way around. So, yes, Silver's political forecasting is exceedingly accurate, and his writing is hit or miss.
Profile Image for Patrick Brown.
141 reviews2,454 followers
October 22, 2012
This was a fun read that tickled the nonfiction part of my brain in pleasant ways. It felt a bit repetitive in parts, and I found myself wondering how various chapters (such as the chess chapter) related to the whole. In the end, I'll take from this book the need to think probabilistically in life, and Bayes' theorem, about which I knew little. The chapter on terrorism was an excellent ending to the book, as it not only tied the concepts together, but it also made apparent the stakes in predicting the future. The McLaughlin Group, for instance, gets to keep coming back each week, even though their predictions are laughably bad. When you're trying to guess whether a terrorist might nuke New York...well, you kind of have to be more right about that.

Still, I'm not sure this book quite adds up to the sum of its parts. For instance, after reading about the super-skilled sports gambler, I didn't have any better idea how he did what he did than I had before reading the chapter. Perhaps he wouldn't tell Silver his secrets; I don't know. I doubt my predictions will get much better from having read this book, either (though I wonder whether that was the goal of the book or not). I'd still recommend it to anyone with a love of charts, a thirst for interesting data-driven nonfiction, or a desire to shake up their reading list with something a little different.
Profile Image for Susan Visser.
516 reviews4 followers
November 2, 2012
I really enjoyed the book, Nate's talk, and meeting him in person. The book is about predictions and goes through many world events that we can all relate to and discusses the signals and noise that went on around these events.

You'll recognize the 2008 US election, the large earthquakes (especially in Japan), swine flu (both the 1970s outbreak and the more recent epidemic), economic meltdowns, 9/11, Pearl Harbor, stock market fluctuations, and much more. Throughout these stories, we learn what the predictions were and why they failed or succeeded. Nate gives advice on how the predictions could have been improved in these particular incidents, and also gives the reader advice on how to make accurate predictions in similar situations.

One of the most amazing things you'll learn in the book is that weather prediction is one of the best success stories. Most of us think that weather forecasters are the worst at their jobs, but we're not thinking about probability as we should.

You'll learn about Bayes's theorem of probability and how to use it in fun things like winning at poker!

I enjoyed the book very much and encourage you to read it!
Profile Image for Ms.pegasus.
695 reviews130 followers
April 20, 2014
Yes, this book is by that guy: Nate Silver, who correctly predicted the winner of the 2008 presidential election in 49 out of 50 states. That might seem off-putting; the credentials portend a heavy tome on statistics. Those fears are quickly allayed. This book is entertaining as well as informative.

Silver offers solace to those frustrated by information overload. Over-simplification on the one hand and brute-force data crunching on the other can both lead to serious errors. Of the latter he writes: “The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning. ...Data-driven predictions can succeed – and they can fail. It is when we deny our role in the process that the odds of failure rise. Before we demand more of our data, we need to demand more of ourselves.”

This is a book that provides context as well as an explanation for something called Bayes's Theorem. Silver begins by considering the many recent instances of blatantly failed prediction. These include the 2008 housing bubble, the collapse of the Soviet Union, and the Fukushima disaster. In all of these examples he probes the multiple reasons behind human error. Among these is our very human imperative to interpret through patterns. Vision and taste, for example, are perceptions derived from the brain's ability to discern pattern. In a similar way, we try to make sense of events affecting our lives. Unfortunately, all too often, we are unable to separate significant data from insignificant data. In the data-rich field of economic forecasting, it's all too easy to develop models that overfit the data, accounting for insignificant and significant data points indiscriminately. A dense layer of possibly random correlations is captured in a convoluted skein of calculations fed into a computer to generate a “pattern”: “The wide array of statistical methods available to researchers enables them to be no less fanciful – and no more scientific—than a child finding animal patterns in clouds.”

A second major source of error is emotion. Experts are frequently wrong because they simply don't want to look bad. He cites the participants of the McLaughlin Group. An outlandish prediction which proves true will be remembered. If it's false, people tend to forget. There is a built-in incentive to grandstand, making outlandish predictions. Scholars may have the opposite incentive: It's safer to stay within the consensus rather than risk looking foolish. Silver also points out another dichotomy. Some experts are so wedded to a pet theory or model that they are incapable of recognizing contradictory data. He characterizes such people as hedgehogs; their opposite are the nimble minded foxes, always seeking out new information and willing to try out new frameworks for fit. Finally, he cites an innate tendency to ignore frightening signals. “Human beings have an extraordinary capacity to ignore risks that threaten their livelihood, as though this will make them go away.”

Along the way, he redefines the problem of forecasting in today's world. We live in a world of complex and dynamic systems. A promising forecasting model must allow for adjustment through feedback. Context is always important to separate independent from dependent data points. For example, during the housing bubble, the rating agencies did not recognize that the playing field for issuing mortgages had shifted drastically. The assumption that each mortgage default within a given tranche was independent was the basis for their overly optimistic credit ratings. A corollary of this is that qualitative information must be included in the forecasting process. The problem then becomes how to quantify qualitative data. Finally, we live in a world of uncertainty. Failing to include uncertainty in forecasting calculations is a form of denial. In other words, there is a lot of noise and a sparsity of signal. How can uncertainty be expressed and used in the forecasting process?

It cannot fail to astonish most readers that Silver cites weather forecasting as one of the more successful efforts in forecasting. First, meteorologists work with hypotheses that describe how weather systems work. Second, there is an enormous amount of data. Third, the models are constantly being improved as new data either affirms or disproves the latest prediction.

Silver also discusses a technique called agent-based modeling, used to predict the spread of epidemics. Incorporated into the model is a sim-city of human behavior parsed by demographic details down to the minutest level.

In Chapter 8 Silver finally introduces Bayes's Theorem. In addition to his own examples, he uses the classic example of how the rate of false positives in a sample of mammograms affects the actual probability that a positive test indicates the presence of cancer. Rather than repeat the explanation here, I have added some useful websites in the notes section. (The reason I do this is that the more ways a math problem is explained, the likelier it is that understanding will eventually come. I admit it: I didn't understand the formula itself until I had worked through several of these alternative explanations. My favorite is the one that uses decision trees.)

It's amusing that Silver chooses as his first example a scenario in which a woman finds a stranger's underpants in her husband's bed. Using Bayes's Theorem, he gets the probability of infidelity down from an intuitive 50% to only 29%! Imagine the beleaguered husband giving this explanation to his wife!

Bayes's Theorem is all about conditional probabilities: there is an assumed prior probability and a resulting posterior probability. The general idea is that even if the prior probability is a wild guess, it will be refined by repeatedly recalculating the formula as new data arrives. The result isn't a prediction; it's only a probability that a proposition is true. It's a technique for weighting new data against older data. It's a reminder that uncertainty arises not just from the numbers we collect, but from the innate complexity of the events we are attempting to study. The method is contrasted with the more familiar bell-curve assumptions of frequentism.
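The updating loop described above — yesterday's posterior becomes today's prior — can be sketched in a few lines of Python. The probabilities here are made-up illustrative numbers, not figures from the book:

```python
def bayes_update(prior, p_data_if_true, p_data_if_false):
    """One pass of Bayes's theorem: turn a prior into a posterior."""
    numerator = prior * p_data_if_true
    return numerator / (numerator + (1 - prior) * p_data_if_false)

# Start from an almost arbitrary guess and let repeated evidence
# refine it: each posterior becomes the prior for the next update.
belief = 0.05
for observation in range(3):
    belief = bayes_update(belief, p_data_if_true=0.8, p_data_if_false=0.3)
    print(round(belief, 3))
```

Even starting from a 5% prior, three observations of evidence that is more likely under the hypothesis than not push the belief steadily upward, which is the "refined by repeated recalculation" idea in miniature.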

Silver's varied interests are reflected in this book. He provides examples from Kasparov's chess matches with Deep Blue, and an interview on poker strategy with Tom Dwan. These examples serve to illustrate the dynamic properties of applying Bayes's Theorem. Anyone interested in either of these areas should definitely take a look at Silver's commentary.

Will this book leave you an expert on Bayesian Theory? By no means. The book is designed to whet your appetite. Silver concludes with the final consolation: “Prediction is difficult for us for the same reason that it is so important: it is where objective and subjective reality intersect.”

NOTES:
Silver's formulation of Bayes's Theorem: Posterior Probability = (x × y) / (x × y + z × (1 − x)), where x is the prior probability that the hypothesis is true, y is the probability of observing the evidence if the hypothesis is true, and z is the probability of observing the evidence if the hypothesis is false.
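That x-y-z formulation is easy to turn into code as a sanity check. The mammogram numbers below are typical textbook-style values chosen for illustration, not necessarily Silver's exact figures:

```python
def posterior(x, y, z):
    """Silver's x-y-z form of Bayes's theorem.

    x: prior probability the hypothesis is true
    y: probability of the evidence if the hypothesis is true
    z: probability of the evidence if the hypothesis is false
    """
    return (x * y) / (x * y + z * (1 - x))

# Illustrative mammogram numbers: 1.4% prior chance of cancer,
# 75% true-positive rate, 10% false-positive rate.
p = posterior(x=0.014, y=0.75, z=0.10)
print(f"{p:.1%}")  # about 9.6%: far less conclusive than a "75% accurate" test feels
```

With a low prior, even a fairly accurate test leaves the posterior under 10%, which is exactly the counterintuitive point the mammogram example is used to make.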
Additional websites that explain Bayes's Theorem:
https://www.youtube.com/watch?v=aGnVj...
This is a video explanation using a decision tree
https://www.youtube.com/watch?v=E4rlJ...
This is a classroom video which includes a decision tree explanation
http://betterexplained.com/articles/a...
This is a really detailed text explanation covering Bayes' Theorem step-by-step with interactive calculation boxes.