Normal Accidents: Living with High-Risk Technologies

4.03  ·  Rating details ·  393 ratings  ·  38 reviews
Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because systems complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, t…
Paperback, 464 pages
Published October 17th 1999 by Princeton University Press (first published 1984)


Community Reviews

Michael Burnam-Fink
Oct 06, 2014 rated it liked it
Shelves: 2014, sts, academic
Normal Accidents is a monument in the field of research into sociotechnical systems, but while eminently readable and enjoyable, it is dangerously under-theorized. Perrow introduces the idea of the 'Normal Accident': the idea that within complex and tightly coupled sociotechnical systems, catastrophe is inevitable. The addition of more oversight and more safety devices merely adds new failure modes and encourages the operation of existing systems with thinner margins for error. Perrow provides nu…
Nov 14, 2008 rated it it was amazing
Shelves: current-affairs
8/14/2011 I keep recommending this book and with the BP disaster, it continues to be very, very timely. One of the points made by Perrow is that when complex technology "meets large corporate and government hierarchies, lack of accountability will lead inexorably to destructive failures of systems that might have operated safely." (From a review of A Sea in Flames: The Deepwater Horizon Oil Blowout by Gregg Easterbrook in the NY Times April 23, 2011.)

Note added 3/2/09: Perrow's discussion of the…
Peter Mcloughlin
This book covers the concept that complex technologies will be subject to normal accidents. Accidents are usually attributed to design failures or human error, but the nature of complex technologies with parts that are closely coupled to each other makes catastrophic failure so likely that it is a normal occurrence, hence a "normal accident." This book was written in the eighties, so it opens with the Three Mile Island nuclear accident in 1979 and goes on to talk about problems in the nuclear indu…
Feb 14, 2012 rated it did not like it
The book discusses various systems and their tendency to fail. It is full of evidence of various failures in the past, but it's very dry and boring to crunch through all of the stories. I was also furious about the author's attempts to draw conclusions and generalizations from what was anecdotal evidence at most.
Nov 24, 2007 rated it really liked it
Sea Story: I worked as a shipboard Radio Officer for Exxon Shipping company on their tanker fleet, and I spent 30 days aboard the Exxon Valdez shortly after it came out of the shipyard as a brand new tanker. When I was home on vacation, a neighbor called and told me about the ship grounding in Prince William Sound. I happened to be with a former Captain of the Valdez. The first thing out of his mouth was, "I hope it was _______" (the last name of one of the two Captains who rotated tours on that…
Man, was this book ever a slog. I had high hopes for it - I have a morbid habit of reading accounts of failure analysis. Plus, I thought I might learn something useful, since I work on a complex system, where (at least we'd like to think) we've done a pretty good job of planning for single component failures, but we do still have potential for unanticipated system interactions. I was aware this book was written by a sociologist, but I thought it might be even better for that reason - could be goo…
Ryder Author Resources
Mar 21, 2017 rated it it was amazing
I first learned about this book from the bibliography of a Michael Crichton novel (I think -- Airframe, maybe?) more than 20 years ago, and I've read it three times since. I'm sure not everyone would find it as riveting as I did, and it can get a little dry and/or repetitive at times, but it's a fascinating exploration of complex systems.

Many negative reviews focus on the book's failings, such as the fact that Perrow's "doomsday" predictions haven't come to pass or that he doesn't offer workabl…
AJ Armstrong
Sep 17, 2014 rated it it was ok
Shelves: abandoned
I had high hopes for this oft-cited work, but it unfortunately continues the perfect record sociologists have for disappointing me. While there is certainly good primary research in evidence, and the discussion of coupling in complex systems would have been valuable at the time of initial publication (it scans a bit trite now), the author clearly has an inexplicably Luddite agenda. It is obvious, almost from the outset of the work, that rather than analyzing complex systems with an interest to i…
Kevin J. Rogers
Feb 05, 2008 rated it really liked it
Recommends it for: Students of management and/or technology.
Dr. Perrow makes a striking point in this excellent analysis of the risks of complex technology: the very engineering safeguards that are intended to make high-risk technologies safer may in fact make them riskier by adding an additional level of complexity, one that has more to do with the perceptions and interactions of the people intending to manage the system than the system itself. The solution, according to Dr. Perrow, is a two-dimensional analytical framework combining complex vs. linear…
Feb 12, 2016 rated it liked it
I had heard quite a bit about this book, and it mainly delivered. I enjoyed the attempts to quantify system accidents. I appreciated the invention of a framework to discuss accidents, work systems, and victims. I also enjoyed the numerous case studies presented in the book, because it is in the specific cases that we can glean ideas about prevention.

Unfortunately, I found myself disagreeing with some of the main conclusions of his analysis, particularly relating to abandoning certain kinds of technol…
(Read for ES module)
Although this looks like a dense academic textbook on high technology, it's actually very accessible and absolutely fascinating to read. Has some excellent points about how over-complicating technology often leads to accidents which could easily have been prevented had a simpler method been implemented instead. Covers nuclear accidents such as Chernobyl and Three Mile Island, etc.
This is a re-read. Highly recommended book. May serve as the core of my thesis, so I am definitely a fan.
Karen Bilo
Nov 10, 2018 rated it it was ok
It is a well-researched book and he clearly knows what he's talking about, pulling from a litany of experts, but it is repetitive and a bit pedantic. One of his later books - Meltdown - has the essence of this book but put into an easier-to-digest format. This is definitely the more academic version, though; I'd consider Meltdown to be the pop version.

The other problem with this is that it was clearly written a long time ago and hasn't included any recent disasters.
Ferhat Culfaz
Jan 08, 2018 rated it it was amazing
Superb! Perrow is the founder of NAT (Normal Accident Theory), widely cited and applied by a number of organisations.

One should read this book just because once you have read it, it will make you look at systems, complex ones in particular, in a fundamentally different way, especially interrelationships, safety design errors, human factors etc.

Georg Lehner
Aug 21, 2018 rated it it was amazing  ·  review of another edition
Enlightening reading on the assessment of systems with respect to their risk potential.
Also enlightening is the reflection on social ethics and responsibilities with respect to the technical risks. This book empowers the everyday person to get an informed view on technical and industrial developments.
Dee Eisel
Nov 09, 2015 rated it liked it
Shelves: science, disasters
This is a tome. It's dense, it's dated (he's talking about disasters of the late 70s and early 80s most of the time), and it's difficult to penetrate. For people who understand what he's talking about, I bet they get a lot out of it. For me, it was a struggle. I began this book in July and finally finished it today.

But I did get a few things out of it. I understand more of the idea of coupling, of events that seemingly inevitably lead one to another because of the way a system works. I am gettin…
Ian Tymms
It's hard to put labels on this book. It's about complexity and systems theory and sociological analysis of workplace relations on the surface, but in drawing all these areas together it also presents a fundamentally humanist analysis countering the pragmatic and rationalist perspectives that are more common in this field.

Beginning with the idea that in complex systems predicting all eventualities is, by definition, impossible, Perrow argues that we must accept that some kinds of accidents wil…
Dec 09, 2011 rated it really liked it
Shelves: pensum

This is a great book
and that is rare for academic texts

Why does it still matter? It's 18 years since it was published, what relevance could it still have upon our everyday life?


For all its shortcomings, this is an important book

Not for what it shows us, but for what it forces us to think about


This book deals with the problems of complex organizations, and then the author offers suggestions to solve these problems.
And that is the biggest problem with this book

His suggestions are either…
Eugene Miya
Nov 30, 2015 rated it it was amazing
While a little dated, and a little too hardware oriented, this is an important book to read. Maybe should be read along with Atul Gawande's books (commentaries on medical problems). Other reviewers think Perrow is a Luddite. Might be true, but it's important to read intelligence on the "opposition". A similar read but not as established yet is Trees on Mars: Our Obsession with the Future.

Perrow is required reading from early ARPAnet mailing lists (RISKS-DIGEST/comp.risks on Usenet). Peter Neumann'…
Aug 29, 2010 rated it really liked it
Shelves: non-fiction
An interesting look at how complex systems can lead to unexpected reactions, and that as systems get more and more complex, these accidents will become more and more likely to occur (though the exact nature of the accident will be unpredictable). The book is quite interesting, but a little dense and not quite as easy to read as other pop-science books I've read recently.

The book was originally published in 1984 with an update in 1999, and I would be interested to see a further update. He predict…
Matt Brown
I read this as part of a reading club at work, since it's commonly claimed that our complex computer systems exhibit many of the same sorts of failure modes and system incidents as described in this book.

I found it an interesting read, and certainly there are some core truths and useful concepts presented in the book, primarily in the first half, but the latter portions of the book, particularly the chapters on recombinant DNA, didn't really resonate with me at all.

The age of the book is clearly evide…
Mar 20, 2009 rated it it was amazing
Shelves: non-fiction
This book is an excellent discussion on how systems go wrong because of unforeseen relationships: ships and planes that end up crashing because the galley electrical systems get overloaded, things like that. I think it's a must-read for anyone involved in complex system design, or in the design of "simple" systems that are part of larger, more complex ones.

One thing I'd caution against, though: I read this on a plane! It only freaked me out a little, but I kept wondering if I was scaring any of…
Mark Terry
Jul 05, 2011 rated it really liked it
This was tougher than I expected. The concepts were important. The examples were cogent and meaningful. The writing seemed accessible, but I found this took longer and was more distant than other books on the subject that I have encountered. This is a landmark work on the subject, and I absolutely recommend it to anyone serious about safety. Just be prepared for an academic treatment of the subject. With all that implies -- good and bad.
Apr 25, 2015 rated it liked it
Liked the first half. Was a good perspective and root cause analysis of a variety of accidents. Got a little too preachy for me in concluding that for some dangerous industries (those that are complex, tightly coupled, and contain catastrophic potential), either government needs more oversight or the process should not be pursued. I think we can do better than that.
Feb 08, 2015 rated it it was amazing
"Well, I, uh, don't think it's quite fair to condemn a whole program because of a single slip-up, sir." - Dr Strangelove. This book is a brilliant combination of history, engineering, psychology, and sociology. Despite being 31 years old (which allowed it to eerily predict the Chernobyl accident), it has only gained in relevance as our lives become more intertwined with complex systems.
TK Keanini
Apr 08, 2007 rated it it was amazing
I heard of this book from one of Bruce Schneier's podcasts. It was in this book that I learned of the notion of tightly and loosely coupled systems. Very good read if you are designing complex systems.
Vasil Kolev
There are some interesting ideas and stories in this book, but there wasn't enough on system accidents, and the theoretical part was either too thin or too muddled. The afterword on Y2K was especially bad.
Forrest Horton
Feb 19, 2014 rated it it was ok
Perrow (a Yale professor) makes an interesting argument about inherent risk involved with complex systems. Unfortunately, this book seems terribly out-of-date; without analysis of our advanced computer technologies, this book might as well be obsolete.
Phil Moyer
Jan 15, 2016 rated it it was amazing
One of the classics of Risk and Risk Assessment/Management. This book is a bit like reading the archives of the Risk forum at SRI, but with much more science and theory supporting the case studies. Very much worth the read, even if the book is getting a little long in the tooth now.
  • The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA
  • The Logic of Failure: Recognizing and Avoiding Error in Complex Situations
  • Why Things Bite Back: Technology and the Revenge of Unintended Consequences
  • Field Guide to Understanding Human Error
  • Sorting Things Out: Classification and Its Consequences
  • Spam: A Shadow History of the Internet
  • Inviting Disaster: Lessons From the Edge of Technology
  • Success Through Failure: The Paradox of Design
  • The Reflective Practitioner: How Professionals Think in Action
  • The Battle of $9.99: How Apple, Amazon, and the Big Six Publishers Changed the E-Book Business Overnight
  • Human Error
  • Managing the Unexpected: Resilient Performance in an Age of Uncertainty
  • Ignition!: An informal history of liquid rocket propellants
  • A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming
  • Visual and Statistical Thinking: Displays of Evidence for Decision Making
  • Usability Engineering
  • Leviathan and the Air-Pump: Hobbes, Boyle, and the Experimental Life
  • The Sinews of Power: War, Money and the English State, 1688-1783

“Organizational theorists, at least since Burns and Stalker, 1961 and Joan Woodward, 1965 in what came to be called the contingency school, have recognized that centralization is appropriate for organizations with routine tasks, and decentralization for those with nonroutine tasks.”
“example of a phenomenon that will concern us in this chapter: production pressures in this high-risk system.”