Why do we make mistakes? Are there certain errors common to failure, whether in a complex enterprise or daily life? In this truly indispensable book, Dietrich Dörner identifies what he calls the “logic of failure”—certain tendencies in our patterns of thought that, while appropriate to an older, simpler world, prove disastrous for the complex world we live in now. Working with imaginative and often hilarious computer simulations, he analyzes the roots of catastrophe, showing city planners in the very act of creating gridlock and disaster, or public health authorities setting the scene for starvation. The Logic of Failure is a compass for intelligent planning and decision-making that can sharpen the skills of managers, policymakers and everyone involved in the daily challenge of getting from point A to point B.
The main gist of this book is that human beings don't always behave logically. This is entirely valid, but not very interesting to me. And anyway there are many other books on that topic. The attraction of this book was the promise of something that could help to deal with "complex situations." Unfortunately, the book is about computer simulations--NOT about "avoiding error" in the real world. It makes sense to rely on sims when it's not possible to study something in the real world, for example with global warming. Otherwise, it's preferable to look at what has actually worked. If the author's games represented a good method for building skills in dealing with complex situations then he could test that prospectively in the real world. But he offers no evidence that playing these games leads to good relevant outcomes.
Dietrich Dörner is an authority on cognitive behavior and a psychology professor at the University of Bamberg, Germany. His research shows that our habits as problem solvers are typically counterproductive.
Probably our main shortcoming is that we like to oversimplify problems. Dörner offers a long list of self-defeating behaviors, but common to all of them is our reluctance to see that any problem is part of a whole system of interacting factors. Any problem is much more complex than we like to believe. And failure doesn't have to come from incompetence. The operators of the Chernobyl reactor, as Dörner points out, were "experts." And as experts, they ignored safety standards because they "knew what they were doing."
Dörner identifies four habits of mind and characteristics of thought that account for the frequency of our failures:
1. The slowness of our thinking: we streamline the process of problem solving to save time and energy.
2. Our wish to feel confident and competent in our problem-solving abilities: we try to repeat past successes.
3. Our inability to absorb quickly and retain large amounts of information: we prefer static mental models, which cannot capture a dynamic, ever-changing process.
4. Our tendency to focus on immediately pressing problems: we ignore the problems our solutions will create.
Successful problem solving is so complex that there are no hard-and-fast rules that work all the time. The best take-away from the book (and this is my favorite quote): "An individual's reality model can be right or wrong, complete or incomplete. As a rule it will be both incomplete and wrong, and one would do well to keep that probability in mind." The book is 199 easy-to-read pages, and Dörner gives lots of interesting examples from lab tests illustrating people's actual behavior in problem-solving situations.
It's a thought-provoking book for anyone whose job is to tackle complex problems. In one way or another that includes anyone in just about any profession.
Ultimately, the author of this book was just stringing a bunch of anecdotes together to try to prove that simulated gaming would be a good training method. I work with people who make a living at developing and employing such games, and who know how to actually use the scientific method, unlike this guy, who couldn't even bother to find any scientific references for the points he was trying to make.
In addition, his basic attitude was that people fail because they aren't as smart as he is, although he was clearly trying really hard not to come out and say this directly. Bad book.
"[in relationship to good and bad planners] The good participants differed from the bad ones in that they tested their hypotheses. The bad participants failed to do this. For them, to propose a hypothesis was to understand reality; testing that hypothesis was unnecessary. Instead of generating hypotheses, they generated 'truths'."
Or in other words, dangling truths unconnected from reality.
The entire book is full of real-world case studies and experiments showing how bad planning is the result of inadequate understanding of the interrelationships between complex systems, inability to visualize non-linear chains of causation and, more importantly, inadequate self-testing of the relationships between our mental models of reality and reality itself. Dorner also digs into the effects exponential processes can have on system failures in terms of their speed and size (his best example is Chernobyl).
These errors all come down to problems of cognition and the difficulty we face in imagining certain levels of complexity, and he provides a lot of advice on how to remedy them at the planning stage, basically by asking the right questions and thinking in the right way. Highly recommend this to anyone interested in how systems fail, be it in the real world or in software.
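To make the exponential point concrete, here is a minimal sketch (my own numbers, nothing from the book) comparing compound growth with the linear extrapolation most of us instinctively reach for:

```python
# A minimal sketch (my own numbers, not from the book) of why exponential
# processes outrun linear intuition in both speed and size.

def exponential(x0, rate, steps):
    """Value after `steps` periods of compound growth at `rate` per period."""
    return x0 * (1 + rate) ** steps

def linear_guess(x0, rate, steps):
    """What a linear extrapolation expects: the first period's increment, repeated."""
    return x0 + (x0 * rate) * steps

x0, rate = 100.0, 0.07
for steps in (10, 30, 60):
    print(f"after {steps:2d} steps: actual {exponential(x0, rate, steps):9.1f}, "
          f"linear guess {linear_guess(x0, rate, steps):7.1f}")
# After 60 steps of 7% growth the true value is roughly 5,800 while the linear
# guess is only 520: that gap is the runaway behaviour described above.
```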
This is the best book I've read on the patterns common to all failures. It cuts to a very fundamental limit of our cognition and reasoning. When we build systems that challenge our perception or cognitive capacities, we are just asking for trouble.
This is a book I always have on hand for reference.
There is a lot to learn from simulation games, board or computer, in seeing how good or poor the modelling of a situation is.
One of the classic cases of poor decision making was the Therac-25 radiation therapy machine, where keys didn't work properly, operators had to override the controls, and people got burned and died from the machines.
One can look up this creepy situation:
"For six unfortunate patients in 1986 and 1987, the Therac-25 did the unthinkable: it exposed them to massive overdoses of radiation, killing four and leaving two others with lifelong injuries. During the investigation, it was determined that the root cause of the problem was twofold. Firstly, the software controlling the machine contained bugs which proved to be fatal. Secondly, the design of the machine relied on the controlling computer alone for safety. There were no hardware interlocks or supervisory circuits to ensure that software bugs couldn’t result in catastrophic failures."
"The case of the Therac-25 has become one of the most well-known killer software bugs in history. "
I did not like this book. I am no academic, but I was appalled by the fact that the findings of this book are founded on such a small population of observations made in game-like scenarios. And then, to move from the specific to the generic, with broad strokes of the brush. A presumptuous book full of truisms. The one thing to take away from this book is that real-life systems are complex and not easily understood by humans.
The author of this book is a social scientist who uses computer simulations as a way of studying human problem-solving behavior. Some of his insights into why people have problems dealing with complex situations:
- People have trouble understanding processes that work over time. People tend to respond to the situation, rather than to the process that produces the situation, leading them to overshoot or undershoot in their response.
- Both good problem solvers and bad problem solvers develop hypotheses. But only good problem solvers bother to test them.
- People often get sucked into solving the problems that are easy to solve, rather than the problems that are most important.
There's lots of food for thought in this book, and some recommendations for people who would like to improve their complex problem-solving abilities. (One major recommendation: play more simulation-type computer games. All those hours you spent playing Civilization? You were honing your problem-solving skills.)
My one quibble with the book is that I would have liked to see the author apply his findings to more real-world cases of bad decision making. The book does have an absolutely fascinating analysis of the chain of bad decisions that led to the Chernobyl disaster, and I'd have loved to see more of that kind of work.
Concise, direct, and to the point. The Logic of Failure is an exploration of some of the prominent factors affecting our ability to plan and act. Factors that no one lacks, but that seem to slip past our common sense (which just means our present and available brain capacity or capabilities); factors such as temporal configurations, recognition of non-linear relationships, the tendency to employ methodism, etc. There are many factors affecting our planning and actions; of considerable importance is the ability to track and monitor our actions. Complex systems are by definition too complicated for us to draw out single cause-effect relations, and as a matter of fact there are no such relations: cause-effect relations are compounded, interrelated, and interdependent. The book describes many of these factors, which I encourage everyone to explore. Borrowing from the book, it is not possible to teach and learn all these factors, but a lot of simulated exposure can hone our common sense to the circumstances of a situation.
When Cass Sunstein calls a book a classic (with similar praise from Tim Harford), it's definitely worth checking out.
Until reading this book, I had no idea this type of thinking/analysis was a thing, but now that I've read the book, I can't believe I lived an existence without employing this type of analysis.
The world is full of complex, dynamic, multi-variable, interconnected problems that our brains systematically fail to comprehend. Through experience dealing with these systems (and reading this book!), we can learn to overcome our misleading heuristics.
We need to:
- define clear goals
- think non-linearly (and truly comprehend compound growth)
- think of all possible solutions available
- understand the problems that don't exist yet but could
- prioritize the things we are optimizing for
- seek feedback and learn from our actions
- understand how different variables dynamically impact each other (beyond single cause and effect; see the sketch below)
- understand delayed feedback.
This should be required reading for anyone in charge of ANYTHING!
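On the "beyond single cause and effect" point in the list above, here is a toy sketch loosely in the spirit of the Sahel-style simulations the book describes; the rules and numbers are entirely my own invention, not Dörner's model:

```python
# Toy sketch (my own rules and numbers, not Dörner's model): improving one
# variable (water supply) lifts a second (cattle), which quietly destroys a
# third (grassland). No single cause-and-effect view catches the chain.

def simulate(extra_wells, years=30):
    grass, cattle = 100.0, 8.0
    cattle_cap = 10.0 + 5.0 * extra_wells      # wells set how many cattle can be watered
    for _ in range(years):
        regrowth = 0.3 * grass * (1.0 - grass / 100.0)     # logistic regrowth of pasture
        grass = max(0.0, grass + regrowth - 0.5 * cattle)  # regrowth minus grazing
        cattle += 0.4 * (cattle_cap - cattle)              # herd grows toward the water limit
    return grass, cattle

for wells in (0, 2):
    grass, cattle = simulate(wells)
    print(f"{wells} extra wells -> after 30 years: grass {grass:5.1f}, cattle {cattle:4.1f}")
# With no extra wells the pasture settles near a healthy level (around 79).
# With two extra wells the larger herd eats more than the grass can ever
# regrow, and the pasture is grazed down to zero.
```

The particular numbers don't matter; the point is that the damage emerges only from the interaction of three quantities, none of which looks alarming on its own.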
Interesting breakdown of what makes complex problems complex, and how we can train our brains to better assess problems in a dynamic world (hint: change over time and the specific conditions a problem exists within are often overlooked). However, Dörner relies a bit too heavily on results of experiments where participants play computer simulations (think academia's version of The Sims) to make his points; the 25th graph is only marginally more informative than the 10th. The book drags, despite discussing very thoughtful psychological findings. Ends on a completely bizarre note that we should all just play video games to train our complex problem solving skills - really? That's the best encouragement you can muster in the face of continued and predictable human failure?
Great book that got me thinking about taking a whole-system view when it comes to complex problem solving. We all too often think about only our role in the problem-solving process, which can have detrimental effects in the long term. The Logic of Failure lays out the foundation behind failures and then does a masterful job of explaining how to learn from and minimize failure, with both an explicit and implicit view. I recommend the book to anyone looking to become a better problem solver and decision maker.
Dietrich relates some curious anecdotes of how humans fail when dealing with situations, computer-simulated or real; however, the general conclusions that he extracts are rather obvious:
- It is hard to control complex scenarios with multiple parts that influence each other.
- It is hard to predict the future of non-linear (exponential or oscillatory) situations.
These problems are not so obvious or apparent when dealing with real-time decision making; however, the book teaches nothing about how to overcome them apart from saying: be careful.
The Logic of Failure is a popular translation of what appears to be some pretty hefty scholarly literature (I think; I didn't bother to actually check 30 years of literature in German), and it is hindered by having become largely accepted wisdom. Dorner is a cognitive scientist who based this book on a series of studies of how people interacted with computer models: desertification in the Sahel, the economy and politics of a small town, predator and prey interactions. These studies, along with some examples drawn from recent events like Chernobyl and military history, are used to explain failure as a consequence of a lack of understanding of complex systems.
Complex systems (interconnected networks with time delays, buffering units, hidden keystone variables, and unclear indicators) are everywhere in the real world. Unfortunately, human minds tend to think linearly and concretely. Dorner documents several pathological thinking styles he encounters in his experiments. Some people over-correct, making dramatic changes while chasing an indicator until their induced oscillations drown out any real data. Some people get lost chasing irrelevant details, asking for more information rather than acting. And some people get trapped in methodism, following a predetermined course of action in complete disregard of the information coming in.
Against this, Dorner advocates for having a clear mental model of a system, discrete objectives, and a holistic sense of possible higher-order effects. Make small changes, seek steady states, and do not try to race a chaotic system. He points towards 'wisdom' with maddening vagueness. If there's a major problem with this book, it's that it's been overtaken by the zeitgeist. Dorner's methods are now children's toys rather than cutting-edge science. We all 'get' networks and complexity, but we still lack the language to truly understand them.
I re-read this recently and it held up quite well. It is probably the closest thing to a self-help, psychology, or business advice book I would read. I'm not getting soft or developing ambitions of business consulting; it is academic, analytical, and focuses on failures and fiascos. The subject is how poorly and predictably our intuition and thought patterns serve us in complex situations.
In addition to real-world situations (Chernobyl, for example), the author describes various laboratory experiments in which volunteers (victims?) run through simulations which are designed to make them fail. In fact, Doerner can rather predictably get participants to fail even when they are given essentially complete knowledge of the situation and no individual aspect is beyond the comprehension of a 3rd grader. For one simple example: asking people to try to control the temperature of a freezer with access to the chiller power, but not a thermostat. (This one brought back memories of undergrad labs, and my impatience running old-fashioned melting point apparatuses that used oil baths and a power knob.)
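For readers who want to feel the trap rather than just read about it, here is a rough sketch of that kind of delayed-feedback task; the dynamics and the impatient control rule are my own stand-ins, not Dörner's actual cold-store experiment:

```python
# Rough sketch (my own stand-in dynamics, not Dörner's experiment): you set a
# chiller power level, but the temperature only responds after a delay, so
# reacting to what you see right now leads to massive over-correction.

from collections import deque

TARGET = 4.0      # desired temperature (degrees C)
DELAY = 5         # steps before a new power setting takes effect
GAIN = 0.8        # how hard the impatient operator leans on the knob per degree of error

temperature = 15.0
power = 0.0
pipeline = deque([0.0] * DELAY, maxlen=DELAY)     # power settings still "in flight"

for t in range(40):
    effective_power = pipeline[0]                 # the setting chosen DELAY steps ago
    temperature += 0.5 - 0.1 * effective_power    # ambient warming minus cooling
    power = max(0.0, power + GAIN * (temperature - TARGET))  # react to the error seen now
    pipeline.append(power)                        # this choice will only bite DELAY steps later
    if t % 5 == 0:
        print(f"t={t:2d}  temp={temperature:7.2f}  power={power:6.1f}")
# Because each adjustment only takes effect five steps later, the operator keeps
# piling on power while "nothing happens", and the accumulated corrections then
# drive the temperature far below the target instead of settling at 4 degrees.
```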
There is a good bit of sadistic thrill in watching the very educated volunteer victims flail miserably at tasks specifically designed to steer them into their worst patterns of thought. All for the good of science, of course.
"On s'engage partout, et puis l'on voit" - Napoleon, which roughly translates to "One jumps into the fray, then figures out what to do next."
We live nonlinear lives. When you fill up your gas tank and drive 50 miles, the fuel gauge barely budges; then you drive another 50 and it plunges. That's a classic nonlinear relationship. The math is hard, we hate doing that kind of math, and we kid ourselves into believing that we can generalize such relationships. We can't. All we can do is recognize that complex situations are hard, assess them as they come, and not try to predict the future.
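To put a number on the gas-tank example, here is a tiny sketch with an invented gauge-versus-fuel curve (nothing from the book), showing how a nonlinear gauge defeats the linear extrapolation we instinctively make:

```python
# Tiny sketch with an invented tank geometry: fuel burns at a constant rate,
# but the gauge is a nonlinear function of what's left, so extrapolating from
# the first 50 miles misleads badly.

def gauge(fuel_fraction):
    """Needle position for a given fuel fraction (nonlinear, e.g. an odd tank shape)."""
    return fuel_fraction ** 0.5

TANK_LITERS = 50.0
LITERS_PER_MILE = 0.1          # constant consumption: the true range is 500 miles

for miles in (0, 50, 100, 300, 400, 450):
    fraction = 1.0 - miles * LITERS_PER_MILE / TANK_LITERS
    print(f"{miles:3d} miles driven -> gauge reads {gauge(fraction):.2f}")
# The needle drops only ~0.05 over the first 50 miles; extrapolating that rate
# linearly suggests a ~1000-mile tank, double the real 500-mile range, while
# the last 50-mile leg sees the needle plunge by ~0.13.
```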
I really like this book in that he believes heavily in simulations and steers away from real world anecdotes. Failure has been a big part of my own life. It's really easy for me to look at my own life experiences and try to extrapolate out, unduly. For every complex situation, there is a solution that is clear, simple, but wrong.
I think the most telling part of the book is when Dörner shows how humans, by and large, focus on what Donald Rumsfeld famously called "known knowns" while ignoring the "known unknowns". Our inability to tackle problems that we know exist but are bad at dealing with can have cascading effects.
"The world is richer than it is possible to express in any single language." ~ Ilya Prigogine
Not what I expected. Rather than looking at real situations, this book looks at awesome videogame simulations of highly chaotic (in the mathematical sense) planning scenarios! Like playing a discretized SimCity, or playing 'balance the predators and prey'. The whole time they were describing how the good vs. bad players played, I was thinking 'come on, this basically seems like an IQ test'. Then they said that it didn't correlate at all with IQ, and talked about other cool rationality things. Like it's not coming up with ideas, but testing them. Not asking questions, but asking why questions.
Other main points are "people have a hard time with many-variable nonlinear chaotic systems" (not an actual quote) and the corollary: they can basically only do linear modeling. (That statement should go the other direction, with the corollary label on the other point.)
Then there were some boring parts, and stuff about centralized category models being good and bad that I wish there were more of.
It also has a great description of Chernobyl and then cuts off with "and the rest is common knowledge". Asshole.
Herr Dorner gives us an important book that provides insights into human shortcomings in recognizing and dealing with complex situations. It’s not that we’re not smart, but our cognitive processes are, by turns, petulant, impatient, and lazy. Happily, Dorner provides some assistance in meeting the challenge of complex situations; so, this volume has both theoretical and practical applications. This work has important implications for military strategists, statesmen, and public policy practitioners. Dorner warns that youth and intelligence, well-intentioned though they may be, often come up woefully short when arrayed against complex systems. He notes, contrary to Hollywood’s heroic and misleading imagery, that experience and the ability to learn from one’s mistakes fare comparatively better. These are timely admonitions to those who would try to “turn the world in the palms of our hand,” to use Al Stewart’s apt phraseology. Despite the heavy subject matter, the writing is crisp and lucid; kudos to whoever translated it from German.
Working with intriguing computer simulations of his own invention, Dorner exposes these flaws in our thinking. His examples (sometimes hilarious, sometimes horrifying) and brain-teasing thought experiments teach us how to solve complex problems. Awesome work on planning and decision making that bolsters the rational thinking skills of businesspeople, government officials, or single moms.
When I started reading the book, I found it very interesting to see the failures in the human way of thinking in complex situations, especially if you relate this to computer programming and complex software systems. But after a while the book was constantly repeating the same pattern: a simulation where people made bad decisions, and an analysis of what these bad decisions were. At no point was there a mention of how to change this. Only at the end did the author try to handle this question, and the answer was kind of disappointing: the answer was "play more simulations". Well, unfortunately, the author seems to fall for the trap he set. He must keep in mind that a simulation is software built on mathematical models. Unfortunately, though, as the author correctly stated before, life and thinking are not math. So relying on math to teach you how to think is also problematic. I would prefer it if the author had tried to give hints in every scenario about how to avoid these mistakes, instead of filling the book with scenarios of failure and in the end providing no clear suggestion (actually, I would prefer no suggestion to a couple of pages with the final "solution" of more simulations).
I’ve read a lot of books this year that I wouldn’t recommend to everyone. This one I do recommend to everyone, or at least everyone intent on improving their general thinking and problem solving abilities.
Dörner examines numerous thinking traps and methods to overcome them by examining a number of interesting case studies, both real and simulated. He holds the reader’s attention and keeps the theoretical grounded in the practical.
At no point does Dörner recommend a single approach. Instead, he presents a menu of thinking methods and reflection tools for one to employ to improve their thought and problem solving.
Very good, but a dense read. This was translated from German in a very literal way. Sometimes the sentence structure feels clunky for English speakers.
That said, the translation is my only negative criticism. I enjoyed the way the author broke down complex situations into a series of simple steps and events. I don’t have any advanced training but was able to understand everything.
Recommended for fans of Daniel Kahneman. I found myself thinking of his pre-mortem technique often as I read this book.
Full disclosure that I read this for a class and as a result skimmed some sections. That said, this was a surprisingly excellent and useful read. In short, Dörner's argument is that people are really, really bad at understanding the complexity of situations facing them. Most people either latch on to a single facet of a problem they understand, or the first one that is presented to them, without digging in further. The book is fairly short, and his explanations are intuitive, with lots of interesting implications (e.g., why people believe in conspiracies). I'd recommend it for anyone.
Very good book that describes the mistakes made when making decisions. For example, goals are not clearly defined, too little (or too much) information is gathered, or errors are ignored in order to maintain the perception of one's own supposedly high competence. Very good food for thought and solutions throughout, and great motivation to keep questioning one's own thinking.
A bit like Thinking, Fast and Slow, but with a focus on systems theory. Considering that the book is 35(!) years old and works with many examples and results from planning games of that era, it is surprisingly current.
Also interesting how many of today's concepts had already been thought of back then. The overarching frame of what matters is also clearly established. From today's perspective, however, some elements are missing on how to put one's thinking to best use (e.g., practice in meditation).
Main takeaway: “We human beings are creatures of the present. But in the world of today we must learn to think in temporal configurations. We must learn that there is a lag time between the execution of a measure and its effect. We must learn to recognize “shapes” in time. We must learn that events have not only their immediate, visible effects but long term repercussions as well.”