“Dangerous systems usually required standardized procedures and some form of centralized control to prevent mistakes. That sort of management was likely to work well during routine operations. But during an accident, Perrow argued, “those closest to the system, the operators, have to be able to take independent and sometimes quite creative action.” Few bureaucracies were flexible enough to allow both centralized and decentralized decision making, especially in a crisis that could threaten hundreds or thousands of lives. And the large bureaucracies necessary to run high-risk systems usually resented criticism, feeling threatened by any challenge to their authority. “Time and time again, warnings are ignored, unnecessary risks taken, sloppy work done, deception and downright lying practiced,” Perrow found. The instinct to blame the people at the bottom not only protected those at the top, it also obscured an underlying truth. The fallibility of human beings guarantees that no technological system will ever be infallible.”

Eric Schlosser, Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety