
“Dangerous systems usually required standardized procedures and some form of centralized control to prevent mistakes. That sort of management was likely to work well during routine operations. But during an accident, Perrow argued, “those closest to the system, the operators, have to be able to take independent and sometimes quite creative action.” Few bureaucracies were flexible enough to allow both centralized and decentralized decision making, especially in a crisis that could threaten hundreds or thousands of lives. And the large bureaucracies necessary to run high-risk systems usually resented criticism, feeling threatened by any challenge to their authority. “Time and time again, warnings are ignored, unnecessary risks taken, sloppy work done, deception and downright lying practiced,” Perrow found. The instinct to blame the people at the bottom not only protected those at the top, it also obscured an underlying truth. The fallibility of human beings guarantees that no technological system will ever be infallible.”
― Eric Schlosser, Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety