Engineering a Safer World Quotes
Engineering a Safer World: Systems Thinking Applied to Safety
by Nancy G. Leveson
93 ratings, 4.15 average rating, 14 reviews
Showing 1-16 of 16
“Safety is a system property, not a component property, and must be controlled at the system level, not the component level.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“The goal then is to understand why people did not or could not act differently. People acted the way they did for very good reasons; we need to understand why the behavior of the people involved made sense to them at the time”
― Engineering a Safer World: Systems Thinking Applied to Safety
“In highly automated systems, the operator is often at the mercy of the system design and operational procedures.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“Making all the components highly reliable will not necessarily make the system safe.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“Hindsight bias and focusing only on the operator's role in accidents prevents us from fully learning from accidents and making significant progress in improving safety.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“Insisting that operators always follow procedures does not guarantee safety although it does usually guarantee that there is someone to blame - either for following the procedures or for not following them - when things go wrong.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“Human error is not random. It results from basic human mental abilities and physical skills combined with the features of the tools being used, the tasks assigned, and the operating environment.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“It seems to be common, judging from the number of incidents and accidents that have resulted, for software designers to forget that the world continues to change even though the software may not be operating.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“Safety starts with management leadership and commitment. Without these, the efforts of others in the organization are almost doomed to failure. Leadership creates culture, which drives behavior.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“Virtually all systems contain humans, but engineers are often not taught much about human factors and draw convenient boundaries around the technical components, focusing their attention inside these artificial boundaries.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“When operators receive input about the state of the system being controlled, they will first try to fit that information into their current mental model and will find reasons to exclude information that does not fit. Because operators are continually testing their mental models against reality (see figure 2.9), the longer a model has been held and the more different sources of information that led to that incorrect model, the more resistant the models will be to change due to conflicting information, particularly ambiguous information.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“A basic principle of system theory is that no control system will perform better than its measuring channel.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“Large-scale engineered systems are more than just a collection of technological artifacts: They are a reflection of the structure, management, procedures, and culture of the engineering organization that created them. They are usually also a reflection of the society in which they were created.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“identifying only operator error or sabotage as the root cause of the accident ignores most of the opportunities for the prevention of similar accidents in the future.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“viewing accidents as chains of events and conditions may limit understanding and learning from the loss and omit causal factors that cannot be included in an event chain.”
― Engineering a Safer World: Systems Thinking Applied to Safety
“The most common accident causality models assume that accidents are caused by component failure and that making system components highly reliable or planning for their failure will prevent accidents. While this assumption is true in the relatively simple electromechanical systems of the past, it is no longer true for the types of complex sociotechnical systems we are building today.”
― Engineering a Safer World: Systems Thinking Applied to Safety
