Normal Accidents: Living with High-Risk Technologies

4.02 of 5 stars  ·  198 ratings  ·  23 reviews
This text analyzes the social side of technological risk. It argues that the conventional engineering approach to ensuring safety, building in more warnings and safeguards, fails because system complexity makes failures inevitable. The author asserts that typical precautions, by adding to complexity, may help create new categories of accidents. By recognizing two dimensions ...more
Paperback, 464 pages
Published October 1st 1999 by Princeton University Press (first published 1984)
Community Reviews

8/14/2011 I keep recommending this book, and with the BP disaster it continues to be very, very timely. One of the points made by Perrow is that when complex technology "meets large corporate and government hierarchies, lack of accountability will lead inexorably to destructive failures of systems that might have operated safely." (Quoted in Gregg Easterbrook's NY Times review of A Sea in Flames: The Deepwater Horizon Oil Blowout, April 23, 2011.)

Note added 3/2/09: Perrow's discussion of the
Michael Burnam-Fink
Normal Accidents is a monument in the field of research into sociotechnical systems, but while eminently readable and enjoyable, it is dangerously under-theorized. Perrow introduces the 'Normal Accident': the idea that within complex and tightly coupled sociotechnical systems, catastrophe is inevitable. The addition of more oversight and more safety devices merely adds new failure modes and encourages the operation of existing systems with thinner margins for error. Perrow provides nu ...more
The book discusses various systems and their tendency to fail. It is full of evidence of past failures, but it's very dry, and crunching through all of the stories is boring. I was also frustrated by the author's attempts to draw conclusions and generalizations from what was, at most, anecdotal evidence.
AJ Armstrong
I had high hopes for this oft-cited work, but it unfortunately continues the perfect record sociologists have for disappointing me. While there is certainly good primary research in evidence, and the discussion of coupling in complex systems would have been valuable at the time of initial publication (it scans a bit trite now), the author clearly has an inexplicably Luddite agenda. It is obvious, almost from the outset of the work, that rather than analyzing complex systems with an interest to i ...more
Sea Story: I worked as a shipboard Radio Officer for Exxon Shipping company on their tanker fleet, and I spent 30 days aboard the Exxon Valdez shortly after it came out of the shipyard as a brand new tanker. When I was home on vacation, a neighbor called and told me about the ship grounding in Prince William Sound. I happened to be with a former Captain of the Valdez. The first thing out of his mouth was, "I hope it was _______" (the last name of one of the two Captains who rotated tours on that ...more
Kevin J. Rogers
Feb 05, 2008 Kevin J. Rogers rated it 4 of 5 stars
Recommends it for: Students of management and/or technology.
Dr. Perrow makes a striking point in this excellent analysis of the risks of complex technology: the very engineering safeguards that are intended to make high-risk technologies safer may in fact make them riskier by adding an additional level of complexity, one that has more to do with the perceptions and interactions of the people intending to manage the system than with the system itself. The solution, according to Dr. Perrow, is a two-dimensional analytical framework combining complex vs. linear ...more
This is a re-read. Highly recommended book. May serve as the core of my thesis, so I am definitely a fan.
Nov 12, 2012 Ole rated it 4 of 5 stars
Shelves: pensum

This is a great book
and that is rare for academic texts

Why does it still matter? It's 18 years since it was published; what relevance could it still have to our everyday life?


For all its shortcomings, this is an important book

Not for what it shows us, but for what it forces us to think about


This book deals with the problems of complex organizations, and then the author offers suggestions to solve these problems.
And that is the biggest problem with this book

His suggestions are either
"Well, I, uh, don't think it's quite fair to condemn a whole program because of a single slip-up, sir." - Dr Strangelove. This book is a brilliant combination of history, engineering, psychology, and sociology. Despite being 31 years old (which allowed it to eerily predict the Chernobyl accident), it has only gained in relevance as our lives become more intertwined with complex systems.
An interesting look at how complex systems can produce unexpected interactions, and how, as systems get more and more complex, these accidents become more and more likely to occur (though the exact nature of the accident will be unpredictable). The book is quite interesting, but a little dense and not quite as easy to read as other pop-science books I've read recently.

The book was originally published in 1984 with an update in 1999, and I would be interested to see a further update. He predict
Matt Brown
I read this as part of a reading club at work, since it's commonly claimed that our complex computer systems exhibit many of the same failure modes and system incidents as those described in this book.

I found it an interesting read, and certainly there are some core truths and useful concepts presented in the book, primarily in the first half, but the latter portions of the book, particularly the chapters on recombinant DNA didn't really resonate with me at all.

The age of the book is clearly evide
This book is an excellent discussion on how systems go wrong because of unforeseen relationships: ships and planes that end up crashing because the galley electrical systems get overloaded, things like that. I think it's a must-read for anyone involved in complex system design, or in the design of "simple" systems that are part of larger, more complex ones.

One thing I'd caution against, though: I read this on a plane! It only freaked me out a little, but I kept wondering if I was scaring any of
Mark Terry
This was tougher than I expected. The concepts were important. The examples were cogent and meaningful. The writing seemed accessible, but I found this took longer and was more distant than other books on the subject that I have encountered. This is a landmark work on the subject, and I absolutely recommend it to anyone serious about safety. Just be prepared for an academic treatment of the subject. With all that implies -- good and bad.
(Read for ES module)
Although this looks like a dense academic textbook on high technology, it's actually very accessible and absolutely fascinating to read. It makes some excellent points about how over-complicating technology often leads to accidents that could easily have been prevented had a simpler method been implemented instead. It covers nuclear accidents such as Chernobyl and Three Mile Island.
Forrest Horton
Perrow (a Yale professor) makes an interesting argument about inherent risk involved with complex systems. Unfortunately, this book seems terribly out-of-date; without analysis of our advanced computer technologies, this book might as well be obsolete.
Vasil Kolev
There are some interesting ideas and stories in this book, but there wasn't enough on system accidents, and the theoretical part was either too thin or too muddled. The afterword on Y2K was especially bad.
TK Keanini
I heard of this book from one of Bruce Schneier's podcasts. It was in this book that I learned of the notion of tightly and loosely coupled systems. Very good read if you are designing complex systems.
Feb 02, 2008 Bimus rated it 4 of 5 stars
Recommends it for: people who fancy disasters
Recommended to Bimus by: graduate school
Stuff happens because other stuff happens that we aren't necessarily aware of or can't/won't control. It can lead to big messes. An exposé of 'human error' in famous tragedies of modern, normal life.
Tom King
A classic for a certain type of reader and deservedly so. Even if you, like me, do not have a particular interest in nuclear plant meltdowns or industrial accidents.
Ryan Barrett
Inspired me to write a lengthy blog post comparing it to software development.
A minor chapter on aviation accidents. The explanations of the factors that contribute to accidents are overly scientific and difficult for the average person to follow.
Read this with Human Error in thinking about risk and how and why things go wrong.
  • The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA
  • The Logic Of Failure: Recognizing And Avoiding Error In Complex Situations
  • The Field Guide to Understanding Human Error
  • Why Things Bite Back: Technology and the Revenge of Unintended Consequences
  • Sorting Things Out: Classification and Its Consequences
  • The Reflective Practitioner: How Professionals Think in Action
  • Inviting Disaster: Lessons From the Edge of Technology
  • The Nature of Technology: What It Is and How It Evolves
  • Usability Engineering
  • Ignition!: An informal history of liquid rocket propellants
  • Managing the Unexpected: Resilient Performance in an Age of Uncertainty
  • Leviathan and the Air-Pump: Hobbes, Boyle, and the Experimental Life
  • Success Through Failure: The Paradox of Design
  • The Battle of $9.99: How Apple, Amazon, and the Big Six Publishers Changed the E-Book Business Overnight
  • Visual and Statistical Thinking: Displays of Evidence for Making Decisions
  • Networks of Outrage and Hope: Social Movements in the Internet Age
  • The Design of Design: Essays from a Computer Scientist
  • A Demon of Our Own Design: Markets, Hedge Funds, and the Perils of Financial Innovation
Other books by Charles Perrow:
  • Complex Organizations: A Critical Essay
  • Organizing America: Wealth, Power, and the Origins of Corporate Capitalism
  • The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters
  • Organizational Analysis: A Sociological View
  • The AIDS Disaster: The Failure of Organizations in New York and the Nation


“Organizational theorists, at least since Burns and Stalker, 1961 and Joan Woodward, 1965 in what came to be called the contingency school, have recognized that centralization is appropriate for organizations with routine tasks, and decentralization for those with nonroutine tasks.”