Normal Accidents: Living with High-Risk Technologies
4.00  ·  268 Ratings  ·  31 Reviews
"Normal Accidents" analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because system complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, …
Paperback, 464 pages
Published October 17th 1999 by Princeton University Press (first published 1984)
Community Reviews

Aug 14, 2011 Eric_W rated it it was amazing
Shelves: current-affairs
8/14/2011 I keep recommending this book and with the BP disaster, it continues to be very, very timely. One of the points made by Perrow is that when complex technology "meets large corporate and government hierarchies, lack of accountability will lead inexorably to destructive failures of systems that might have operated safely." (From a review of A Sea in Flames: The Deepwater Horizon Oil Blowout by Gregg Easterbrook in the NY Times April 23, 2011.)

Note added 3/2/09: Perrow's discussion of the…
Peter Mcloughlin
This book covers the concept that complex technologies will be subject to normal accidents. Accidents are usually attributed to design failures or human error, but the nature of complex technologies, with parts that are closely coupled to each other, makes catastrophic failure so likely that it is a normal occurrence, hence a "normal accident." This book was written in the eighties, so it opens with the Three Mile Island nuclear accident in 1979 and goes on to talk about problems in the nuclear indu…
Michael Burnam-fink
Oct 06, 2014 Michael Burnam-fink rated it liked it
Shelves: academic, sts, 2014
Normal Accidents is a monument in the field of research into sociotechnical systems, but while eminently readable and enjoyable, it is dangerously under-theorized. Perrow introduces the 'Normal Accident': the idea that within complex and tightly coupled sociotechnical systems, catastrophe is inevitable. The addition of more oversight and more safety devices merely adds new failure modes and encourages the operation of existing systems with thinner margins for error. Perrow provides nu…
Aug 12, 2013 Jan rated it did not like it
The book discusses various systems and their tendency to fail. It is full of evidence of past failures, but it's very dry and boring to crunch through all of the stories. I was also frustrated by the author's attempt to draw conclusions and generalizations from what was anecdotal evidence at most.
AJ Armstrong
Oct 03, 2014 AJ Armstrong rated it it was ok
Shelves: abandoned
I had high hopes for this oft-cited work, but it unfortunately continues the perfect record sociologists have for disappointing me. While there is certainly good primary research in evidence, and the discussion of coupling in complex systems would have been valuable at the time of initial publication (it scans a bit trite now), the author clearly has an inexplicably Luddite agenda. It is obvious, almost from the outset of the work, that rather than analyzing complex systems with an interest to i…
Dec 11, 2007 Patricia rated it really liked it
Sea Story: I worked as a shipboard Radio Officer for Exxon Shipping company on their tanker fleet, and I spent 30 days aboard the Exxon Valdez shortly after it came out of the shipyard as a brand new tanker. When I was home on vacation, a neighbor called and told me about the ship grounding in Prince William Sound. I happened to be with a former Captain of the Valdez. The first thing out of his mouth was, "I hope it was _______" (the last name of one of the two Captains who rotated tours on that…
Kevin J. Rogers
Feb 05, 2008 Kevin J. Rogers rated it really liked it
Recommends it for: Students of management and/or technology.
Dr. Perrow makes a striking point in this excellent analysis of the risks of complex technology: the very engineering safeguards that are intended to make high-risk technologies safer may in fact make them riskier by adding an additional level of complexity, one that has more to do with the perceptions and interactions of the people intending to manage the system than the system itself. The solution, according to Dr. Perrow, is a two-dimensional analytical framework combining complex vs. linear…
Eugene Miya
Nov 30, 2015 Eugene Miya rated it it was amazing
While a little dated, and a little too hardware oriented, this is an important book to read. Maybe should be read along with Atul Gawande's books (commentaries on medical problems). Other reviewers think Perrow is a Luddite. Might be true, but it's important to read intelligence on the "opposition". A similar read but not as established yet is Trees on Mars: Our Obsession with the Future.

Perrow is required reading from early ARPAnet mailing lists (RISKS-DIGEST/comp.risks on Usenet). Peter Neumann'…
Dee Bitner
Nov 11, 2015 Dee Bitner rated it liked it
Shelves: science, disasters
This is a tome. It's dense, it's dated (he's talking about disasters of the late 70s and early 80s most of the time), and it's difficult to penetrate. For people who understand what he's talking about, I bet they get a lot out of it. For me, it was a struggle. I began this book in July and finally finished it today.

But I did get a few things out of it. I understand more of the idea of coupling, of events that seemingly inevitably lead one to another because of the way a system works. I am gettin…
Ian Tymms
It's hard to put labels on this book. On the surface it's about complexity and systems theory and sociological analysis of workplace relations, but in drawing all these areas together it also presents a fundamentally humanist analysis countering the pragmatic and rationalist perspectives that are more common in this field.

Beginning with the idea that in complex systems predicting all eventualities is, by definition, impossible, Perrow argues that we must accept that some kinds of accidents wil…
This is a re-read. Highly recommended book. May serve as the core of my thesis, so I am definitely a fan.
Man was this book ever a slog. I had high hopes for it - I have a morbid habit of reading accounts of failure analysis. Plus, I thought I might learn something useful, since I work on a complex system, where (at least we'd like to think) we've done a pretty good job of planning for single component failures, but we do still have potential for unanticipated system interactions. I was aware this book was written by a sociologist, but I thought it might be even better for that reason - could be goo ...more
Nov 12, 2012 Ole rated it really liked it
Shelves: pensum

This is a great book,
and that is rare for academic texts.

Why does it still matter? It's 18 years since it was published; what relevance could it still have for our everyday life?

For all its shortcomings, this is an important book.

Not for what it shows us, but for what it forces us to think about.

This book deals with the problems of complex organizations, and then the author offers suggestions to solve these problems.
And that is the biggest problem with this book.

His suggestions are either…
Feb 08, 2015 Phil rated it it was amazing
"Well, I, uh, don't think it's quite fair to condemn a whole program because of a single slip-up, sir." - Dr Strangelove. This book is a brilliant combination of history, engineering, psychology, and sociology. Despite being 31 years old (which allowed it to eerily predict the Chernobyl accident), it has only gained in relevance as our lives become more intertwined with complex systems.
May 22, 2016 Geoffry rated it liked it
I had heard quite a bit about this book, and it mainly delivered. I enjoyed the attempts to quantify system accidents. I appreciated the invention of a framework to discuss accidents, work systems, and victims. I also enjoyed the numerous case studies presented in the book, because it is in the specific cases that we can glean ideas about prevention.

Unfortunately, I found myself disagreeing with some of the main conclusions of his analysis, particularly relating to abandoning certain kinds of technol…
Phil Moyer
Jan 15, 2016 Phil Moyer rated it it was amazing
One of the classics of Risk and Risk Assessment/Management. This book is a bit like reading the archives of the Risk forum at SRI, but with much more science and theory supporting the case studies. Very much worth the read, even if the book is getting a little long in the tooth now.
Jun 30, 2013 Ben rated it really liked it
Shelves: non-fiction
An interesting look at how complex systems can lead to unexpected reactions, and that as systems get more and more complex, these accidents will become more and more likely to occur (though the exact nature of the accident will be unpredictable). The book is quite interesting, but a little dense and not quite as easy to read as other pop-science books I've read recently.

The book was originally published in 1984 with an update in 1999, and I would be interested to see a further update. He predict…
Matt Brown
I read this as part of a reading club at work, since it's commonly claimed that our complex computer systems exhibit many of the same sorts of failure modes and system incidents as those described in this book.

I found it an interesting read, and certainly there are some core truths and useful concepts presented in the book, primarily in the first half, but the latter portions of the book, particularly the chapters on recombinant DNA, didn't really resonate with me at all.

The age of the book is clearly evide…
Mar 20, 2009 Frankie rated it it was amazing
Shelves: non-fiction
This book is an excellent discussion on how systems go wrong because of unforeseen relationships: ships and planes that end up crashing because the galley electrical systems get overloaded, things like that. I think it's a must-read for anyone involved in complex system design, or in the design of "simple" systems that are part of larger, more complex ones.

One thing I'd caution against, though: I read this on a plane! It only freaked me out a little, but I kept wondering if I was scaring any of…
Mark Terry
Sep 26, 2011 Mark Terry rated it really liked it
This was tougher than I expected. The concepts were important. The examples were cogent and meaningful. The writing seemed accessible, but I found this took longer and was more distant than other books on the subject that I have encountered. This is a landmark work on the subject, and I absolutely recommend it to anyone serious about safety. Just be prepared for an academic treatment of the subject. With all that implies -- good and bad.
(Read for ES module)
Although this looks like a dense academic textbook on high-risk technology, it's actually very accessible and absolutely fascinating to read. It makes some excellent points about how over-complicating technology often leads to accidents that could easily have been prevented had a simpler method been implemented instead. Covers nuclear accidents such as Chernobyl and Three Mile Island.
Jul 20, 2015 Les rated it liked it
Liked the first half. It was a good perspective and root-cause analysis of a variety of accidents. It got a little too preachy for me in concluding that, for some dangerous industries (those that are complex, tightly coupled, and contain catastrophic potential), either government needs more oversight or the activity should not be pursued at all. I think we can do better than that.
Forrest Horton
Sep 08, 2014 Forrest Horton rated it it was ok
Perrow (a Yale professor) makes an interesting argument about the inherent risk involved with complex systems. Unfortunately, this book seems terribly out of date; without analysis of our advanced computer technologies, it might as well be obsolete.
Vasil Kolev
There are some interesting ideas and stories in this book, but there wasn't enough on system accidents, and the theoretical part was either too thin or too muddled. The afterword on Y2K was especially bad.
TK Keanini
Apr 08, 2007 TK Keanini rated it it was amazing
I heard of this book from one of Bruce Schneier's podcasts. It was in this book that I learned of the notion of tightly and loosely coupled systems. Very good read if you are designing complex systems.
Feb 02, 2008 Bimus rated it really liked it
Recommends it for: people who fancy disasters
Recommended to Bimus by: graduate school
Stuff happens because other stuff happens that we aren't necessarily aware of or can't/won't control. It can lead to big messes. An exposé of 'human error' in famous tragedies of modern normal life.
Tom King
Jan 26, 2013 Tom King rated it it was amazing
A classic for a certain type of reader and deservedly so. Even if you, like me, do not have a particular interest in nuclear plant meltdowns or industrial accidents.
Ryan Barrett
Apr 21, 2013 Ryan Barrett rated it really liked it
Inspired me to write a lengthy blog post comparing it to software development.
Oct 21, 2012 Wayne rated it it was ok
Only a minor chapter on aviation accidents. The explanations of the factors that contribute to accidents are overly scientific and difficult for the average person to follow.
Dec 15, 2008 Alan rated it really liked it
Shelves: management
Read this with Human Error in thinking about risk and how and why things go wrong.
  • The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA
  • The Logic of Failure: Recognizing and Avoiding Error in Complex Situations
  • Why Things Bite Back: Technology and the Revenge of Unintended Consequences
  • The Field Guide to Understanding Human Error
  • Sorting Things Out: Classification and Its Consequences
  • Spam: A Shadow History of the Internet
  • Inviting Disaster: Lessons From the Edge of Technology
  • Success Through Failure: The Paradox of Design
  • The Reflective Practitioner: How Professionals Think in Action
  • The Battle of $9.99: How Apple, Amazon, and the Big Six Publishers Changed the E-Book Business Overnight
  • Human Error
  • Managing the Unexpected: Resilient Performance in an Age of Uncertainty
  • Ignition!: An informal history of liquid rocket propellants
  • A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming
  • Visual and Statistical Thinking: Displays of Evidence for Decision Making
  • Usability Engineering
  • Leviathan and the Air-Pump: Hobbes, Boyle, and the Experimental Life
  • The Sinews of Power: War, Money and the English State, 1688-1783


“Organizational theorists, at least since Burns and Stalker, 1961 and Joan Woodward, 1965 in what came to be called the contingency school, have recognized that centralization is appropriate for organizations with routine tasks, and decentralization for those with nonroutine tasks.”
“example of a phenomenon that will concern us in this chapter: production pressures in this high-risk system.”