Expert Political Judgment: How Good Is It? How Can We Know?

4.21 of 5 stars · 134 ratings · 19 reviews

The intelligence failures surrounding the invasion of Iraq dramatically illustrate the necessity of developing standards for evaluating expert opinion. This book fills that need. Here, Philip E. Tetlock explores what constitutes good judgment in predicting future events, and looks at why experts are often wrong in their forecasts.
Tetlock first discusses arguments about wh
Paperback, 321 pages
Published August 20th 2006 by Princeton University Press (first published July 5th 2005)


Collaborative Intelligence by J. Richard Hackman · How to Measure Anything by Douglas W. Hubbard · A Tradecraft Primer by United States · Expert Political Judgment by Philip E. Tetlock · Psychology of Intelligence Analysis by Richard J. Heuer Jr.
Best books for intelligence analysts
4th out of 12 books · 4 voters
Animal Farm by George Orwell · The Omnivore's Dilemma by Michael Pollan · The Shock Doctrine by Naomi Klein · 1984 by George Orwell · A People's History of the United States by Howard Zinn
Disturbing Truths
123rd out of 213 books · 176 voters


Community Reviews

(showing 1-30 of 851)
This is a fantastic data-based exploration of just how little political pundits actually know. In fact, the more media exposure pundits have, and the more single-minded their view of the world, the less accurate their political forecasts are. Over 20 years, Philip Tetlock persuaded political experts to make predictions on a wide variety of topics, only to find that most experts were less reliable than a chimp picking options via a dart board. He used Isaiah Berlin's wonderful distinction between the Hedgehog that …
Billie Pritchett
Philip Tetlock's book Expert Political Judgment asks something very simple that is very difficult to find out: how can people make good predictions about big social, economic, and political issues? For example: Is it possible for an expert to have predicted the collapse of the Soviet Union? Did anyone predict the collapse? What kinds of knowledge would an expert need to predict something like that?

After a long and detailed study, he discov
Adam S. Rust
An ambitious and thought-provoking study of the value and reliability of experts in the fields of politics and economics. Starting in the 1980s, political scientist Philip Tetlock interviewed experts, seeking their predictions on the outcomes of various future events (such as Gorbachev's interest in reform, the ascendancy of Japanese economic power, etc.).

The conclusions drawn from the outcomes of these expert predictions were bleak: experts are frequently wrong and almost consistently underperform
Devin Partlow
At first glance you'd think, "Awesome, a book that will help me choose which political experts I should put my faith in." But then you'd have to remember that this big scientific experiment didn't take influence into account. If a prominent figure predicts that something is going to happen, that prediction is going to influence the outcome.

If life could be neatly controlled like simulated lab environments, the results of these social experiments would hold weight, but unfortunately that's
Pete Welter
Political experts and pundits share their predictions every day via think tanks, lectures, and the media. Their qualifications are almost invariably impeccable. However, they often contradict each other, so we know they can't always be right. The questions this book answers are: "How accurate are they?" and, if some experts are more accurate than others, "What are the characteristics of the most accurate forecasters?" To answer these questions, Tetlock did a long-term study with hundreds …
Wai Yip Tung
This is Philip Tetlock's ambitious study to measure and quantify experts' judgment of political events. The main finding is that experts are hardly better than chimps. Still, he discovers two categories of people whose different cognitive styles matter. The "hedgehogs" know one big thing and show great confidence in their grand theory to explain and predict events. The "foxes" know many small things, use diverse sources of information for forecasting, and are more open to …
This book reports on a research project to understand the bases of expert political judgment. What does it mean to make such judgments, and how do we determine their quality -- the "track record" of the experts making them? This is a hard question to address. Quality judgment is not just about whether some prediction comes true or not. It is not just about simple forecasting. It is not about simple topics, such as whether a make of car will be reliable, but conce …
Nick Harris
Not the easiest read; some of the chapters are very heavy on theory, methodology, and statistics, which means this book will never ever win awards. Unless it does. Perhaps.
Michelle Tran
The study and methodology to quantify forecasting judgements was interesting, but the academic verbiage was somewhat distracting.
Patrick Bair
An interesting study in the science of judging judgment; but I must admit, my inner statistician was way more than satisfied.
The best predictor of good judgment is the extent to which people think about thinking.

Interesting book. Feeds into my confirmation bias about the importance of metacognition (and the fact that I am probably a 'fox').

Foxes tend to fall short when presented with a wide range of scenarios (overestimate probabilities of events when imagining many possible scenarios).
Robb Seaton
I'm confused as to the purpose of this book. It starts with data -- formal models consistently outperform human judgment -- and then spends the rest of the book deconstructing what separates terrible human judgment from bad human judgment. Shouldn't we be talking about, you know, model building?
Excellent. It changed the way I think about people: foxes and hedgehogs are part of my internal vocabulary now. The author's conclusions were well documented, and though it is a bit dry, scholarly, and repetitive in parts, I appreciated those choices.
Oct 26, 2011 Jen marked it as potential-reads
Heard Tetlock's Google authors talk about Foxes (better forecasters, not married to ideology) and Hedgehogs (love big organizing principles). He is running the Good Judgment Project.
Fascinating book where the author holds "experts" in social science to account. I like it because it favors ideological flexibility instead of rigidly clinging to one theory.
If you're going to read this book, which I strongly suggest you do, make sure you have a working knowledge of stats before starting.
Everyone should read this book. Now.
  • Why Everyone (Else) Is a Hypocrite: Evolution and the Modular Mind
  • Uncontrolled: The Surprising Payoff of Trial-and-Error for Business, Politics, and Society
  • The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy
  • The Future of Power
  • Heuristics and Biases: The Psychology of Intuitive Judgment
  • Future Babble: Why Expert Predictions Fail - and Why We Believe Them Anyway
  • The Winner's Curse: Paradoxes and Anomalies of Economic Life
  • The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies
  • Calculated Risks: How to Know When Numbers Deceive You
  • Why Leaders Lie: The Truth about Lying in International Politics
  • The Predictioneer's Game: Using the Logic of Brazen Self-Interest to See and Shape the Future
  • Day of Empire: How Hyperpowers Rise to Global Dominance--and Why They Fall
  • Exit, Voice, and Loyalty: Responses to Decline in Firms, Organizations, and States
  • Critical Mass: How One Thing Leads to Another
  • The Republican Brain: The Science of Why They Deny Science--and Reality
  • Judgment Under Uncertainty: Heuristics and Biases
  • Micromotives and Macrobehavior
  • Psychology of Intelligence Analysis
  • Counterfactual Thought Experiments in World Politics: Logical, Methodological, and Psychological Perspectives
  • Behavior, Society, and Nuclear War: Volume II
  • Behavior, Society, and Nuclear War: Volume I
  • Unmaking the West: "What-If?" Scenarios That Rewrite World History
  • Learning in U.S. and Soviet Foreign Policy
