This study of perception and misperception in foreign policy was a landmark in the application of cognitive psychology to political decision making. The New York Times called it, in an article published nearly ten years after the book's appearance, "the seminal statement of principles underlying political psychology."
The perspective established by Jervis remains an important counterpoint to structural explanations of international politics, and from it has developed a large literature on the psychology of leaders and the problems of decision making under conditions of incomplete information, stress, and cognitive bias.
Jervis begins by describing the process of perception (for example, how decision makers learn from history) and then explores common forms of misperception (such as overestimating one's influence). Finally, he tests his ideas through a number of important events in international relations from nineteenth- and twentieth-century European history.
In a contemporary application of Jervis's ideas, some argue that Saddam Hussein invaded Kuwait in 1990 in part because he misread the signals of American leaders with regard to the independence of Kuwait. Also, leaders of the United States and Iraq in the run-up to the most recent Gulf War might have been operating under cognitive biases that made them value certain kinds of information more than others, whether or not the information was true. Jervis argued that, once a leader believed something, that perception would shape the way the leader interpreted all other relevant information.
Robert Jervis is the Adlai E. Stevenson Professor of International Affairs at Columbia University, and has been a member of the faculty since 1980. Jervis was the recipient of the 1990 University of Louisville Grawemeyer Award for Ideas Improving World Order. Jervis is co-editor of the Cornell Studies in Security Affairs, a series published by Cornell University Press, and a member of numerous editorial review boards for scholarly journals.
While Jervis is perhaps best known for two books in his early career, he also wrote System Effects: Complexity in Political and Social Life (Princeton, 1997). With System Effects, Jervis established himself as a social scientist as well as an expert in international politics. Many of his latest writings are about the Bush doctrine, of which he is very critical. Jervis is a member of the American Association for the Advancement of Science and the American Academy of Arts and Sciences. In 2006 he was awarded the NAS Award for Behavior Research Relevant to the Prevention of Nuclear War from the National Academy of Sciences. He participated in the 2010 Hertog Global Strategy Initiative, a high-level research program on nuclear proliferation. He was also president of the American Political Science Association in 2001.
As an aspiring military strategist, I'm supposed to like this classic. If a reader can make the laborious slog through 400+ pages of dense, wooden prose, he or she will understand why. The subject is vitally important: how the perceptions and images held by leaders influence their judgments and decisions, and why these perceptions and images can be profoundly skewed. Jervis has treated the subject in exhausting and methodical detail, offering countless historical examples for each point and carefully noting exceptions and caveats. He offers many important insights that every policymaker should internalize. So why three stars? This book is a dreadful read, especially for the practitioner looking for a distillation of wisdom. It will fail to communicate its message except to the most determined reader, and will send many bleary-eyed IR students Googling for summaries.
Completely nerded out on this one too. Describes deterrence theory and spiral theory, points out the holes in each (not that they are useless, but that neither applies universally), and posits that the real question in international relations is figuring out when it is appropriate to use which. In the author's eyes, it's all about the intent of the other actor. As such, it is crucial that perception be as accurate as possible - every effort must be made to avoid misperception.
Author's thesis is that, while many treat misperceptions as random accidents, they can in fact often be predicted and avoided. Goes on to describe how perceptions are formed, how decision-makers learn, and common sources of misperception.
Bottom line: It's all about understanding and acknowledging your own biases in order to accurately perceive the other actor's intent (and anticipate his own misperceptions of you).
The book focuses on the issue of decision-making in IR and the continual problem of making decisions based on limited information. Misperception as a term is problematic. How can someone misperceive a situation, and how can misperception truly be measured? Isn't the person simply acting rationally at the time, with the scholar or secondhand observer the one who judges the act a misperception? My conclusion remains that the subject acts rationally given the situation and the options present at the moment of the decision.
The text is fundamental and essential to IR, but levels-of-analysis issues are especially present in Jervis's text. Looking at the individual or domestic actors as a variable for an outcome remains incredibly difficult. But if anything, the book again recognizes the individual (as the classical realists did) as an important factor in IR. The use of psychological factors to substantiate claims makes the reader struggle even further to find quality cases for analysis and to look at Jervis's chosen cases as examples of true misperception -- after all, some of his examples come from the biggest wars in world history. Certainly, the individual is vital, but just how important remains unknown.
The book is very long and a hard read, but gives a comprehensive overview. A short summary is that states and politicians have various motivations, and it's really easy to misunderstand the motivations of others and end up in a situation neither of you wanted. The book offers many theories and examples of how this happens - a lot from the early and mid-twentieth century, so it's useful to be familiar with that history to have more context. The part discussing behaviour in the context of contemporary psychological research is perhaps a bit outdated.
A detailed analysis of how perceptions and misperceptions lead to errors in decision making in issues of fundamental importance for war and peace. Some parts of the book are a bit too theoretically heavy and technical. But overall a very good book.
Jervis really is as good as people say, but he's a hard read - I can't say I've gotten through this all the way; I've only referred to specific chapters.
In Perception and Misperception, Robert Jervis takes the baton from Allison and Kahneman and expands the argument that the world is a very complicated place and that our brains overcome this with perceptions and analogies.
The strength of this is that we can rapidly make sense of complicated situations, but the downside is that we are very prone to misperception. Thinking about how we think and how our adversaries think is very important.
These concepts are also very important in understanding how hard it is to be a critical thinker. His discussion of the contradictory logics of deterrence and spiral theory underscores the need to think more about security dilemmas. Doing so may reveal whether capabilities or intentions matter more to me.
We talked about how perceptions are built on past experiences, deep-rooted beliefs, and cognitive consistency. We also discussed how the security dilemma can lead to self-fulfilling prophecies.
Knowing our propensity for misperception helps us understand ourselves and our enemies, specifically by playing devil's advocate, exposing assumptions, and being aware of our own misperceptions.
This book is a must-read for anyone who wants to understand how decisions in international politics are made. It is an easy jump to realize that most people make decisions every day based on the same principles presented in this book.
The main thesis of this book is buried in the middle of it: "Thus statesmen underestimate the costs of forming preliminary hypotheses and so form images more quickly than they would if they understood the processes at work." This sums up the book well: decision makers draw on their previous experience and perceptions to make decisions quickly. Jervis also looks in depth at deterrence theory and spiral theory. Rather than declaring one theory decisively best, he argues that a mixture of the two is appropriate. Jervis also looks at cognitive dissonance, attempting to show that decision makers overvalue their own thoughts and influence. If things go well, it was because of them…if things go poorly, there was obviously an outside influence.
A brilliant work that draws attention to the psychological factors influencing foreign policy decision-making and international relations. And from my experience working on Asia-Pacific affairs, it seems that this book has had an important influence on actual policy discourse in the decades since it was written, as it is likely an important impetus for the abundant attention directed to misperception and mistrust among states in the region. Definitely a bit redundant and wordy at times, but its substantive content is important enough that I'm still giving it five stars.
Political psychology meets international politics. And the result is a fascinating approach to understanding the dynamics of international politics. Here, Robert Jervis applies psychological theory to explain how decision-makers operate--and how their decisions often go wrong as they misperceive the context in which they operate. He also closes the book by noting how decision-makers might enhance the quality of those decisions and reduce the effects of misperception.
Jervis' classic book looks at the role of misperception in international politics and assesses the extent to which it can be used to explain decisions by elites. Specifically, it focuses on decision-making under conditions of limited information, as well as the role that cognitive biases play in the interpretation of such information.