Sidney Dekker quotes (showing 1-27 of 27)

“Accidents are no longer accidents at all. They are failures of risk management.”
Sidney Dekker, Just Culture
“Not being able to find a cause is profoundly distressing; it creates anxiety because it implies a loss of control. The desire to find a cause is driven by fear.”
Sidney Dekker, Just Culture
“If professionals consider one thing “unjust,” it is often this: Split-second operational decisions that get evaluated, turned over, examined, picked apart, and analyzed for months—by people who were not there when the decision was taken, and whose daily work does not even involve such decisions.”
Sidney Dekker, Just Culture
“Accountability can mean letting people tell their account, their story.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“There is almost no human action or decision that cannot be made to look flawed and less sensible in the misleading light of hindsight.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“Unjust responses to failure are almost never the result of bad performance. They are the result of bad relationships.”
Sidney Dekker, Just Culture
“People do not come to work to do a bad job. Safety in complex systems is not a result of getting rid of people, of reducing their degrees of freedom. Safety in complex systems is created by people through practice—at all levels of an organization.”
Sidney Dekker, The Field Guide to Understanding Human Error
“Arriving at the edge of chaos is a logical endpoint for drift. At the edge of chaos, systems have tuned themselves to the point of maximum capability.”
Sidney Dekker, Drift into Failure: From Hunting Broken Components to Understanding Complex Systems
“Asking what is the cause, is just as bizarre as asking what is the cause of not having an accident. Accidents have their basis in the real complexity of the system, not their apparent simplicity.”
Sidney Dekker, The Field Guide to Understanding Human Error
“forward-looking accountability.” Accountability that is backward-looking (often the kind in trials or lawsuits) tries to find a scapegoat, to blame and shame an individual for messing up. But accountability is about looking ahead. Not only should accountability acknowledge the mistake and the harm resulting from it, it should lay out the opportunities (and responsibilities!) for making changes so that the probability of such harm happening again goes down.”
Sidney Dekker, Just Culture
“It has to do with being open, with a willingness to share information about safety problems without the fear of being nailed for them.”
Sidney Dekker, Just Culture
“Safety improvements come from organizations monitoring and understanding the gap between procedures and practice.”
Sidney Dekker, Field Guide to Understanding Human Error
“Underneath every simple, obvious story about ‘human error,’ there is a deeper, more complex story about the organization.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“There is almost no human action or decision that cannot be made to look flawed and less sensible in the misleading light of hindsight. It is essential that the critic should keep himself constantly aware of that fact.”
Sidney Dekker, Just Culture
“Challenges to existing views are generally uncomfortable. Indeed, for most people and organizations, coming face to face with a mismatch between what they believed and what they have just experienced is difficult. These people and organizations will do anything to reduce the nature of the surprise.”
Sidney Dekker, The Field Guide to Understanding Human Error
“A just culture accepts nobody’s account as “true” or “right” and others wrong.”
Sidney Dekker, Just Culture
“Creating a climate in which disclosure is possible and acceptable is the organization’s responsibility.”
Sidney Dekker, Just Culture
“The main question for a just culture is not about matching consequences with outcome. It is this: Did the assessments and actions of the professionals at the time make sense, given their knowledge, their goals, their attentional demands, their organizational context?”
Sidney Dekker, Just Culture
“But errors are consequences: the leakage that occurs around the edges when you put pressure on a system without taking other factors into account.”
Sidney Dekker, The Field Guide to Understanding Human Error
“The question is not how pilots can be so silly to rely on this, but how they have been led to believe (built up a mental model) that this is actually effective.”
Sidney Dekker, The Field Guide to Understanding Human Error
“Indeed, automation does not fail often, which limits people’s ability to practice the kinds of breakdown scenarios that still justifies their marginal presence in the system. Here, the human is painted as a passive monitor, whose greatest safety risks would lie in deskilling, complacency, vigilance decrements and the inability to intervene in deteriorating circumstances.”
Sidney Dekker, The Field Guide to Understanding Human Error
“Saying what people failed to do has no role in understanding ‘human error.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“What is the cause of the accident? This question is just as bizarre as asking what the cause is of not having an accident.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“In complex systems, after all, it is very hard to foresee or predict the consequences of presumed causes. So it is not the consequences that we should be afraid of (we might not even foresee them or believe them if we could). Rather, we should be wary of renaming things that negotiate their perceived risk down from what it was before.”
Sidney Dekker, Drift into Failure: From Hunting Broken Components to Understanding Complex Systems
“If we adjudicate an operator’s understanding of an unfolding situation against our own truth, which includes knowledge of hindsight, we may learn little of value about why people saw what they did, and why taking or not taking action made sense to them.”
Sidney Dekker, Drift into Failure: From Hunting Broken Components to Understanding Complex Systems
“There is almost no human action or decision that cannot be made to look flawed and less sensible in the misleading light of hindsight. It is essential that the critic should keep himself constantly aware of that fact.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“The question that drives safety work in a just culture is not who is responsible for failure, rather, it asks what is responsible for things going wrong. What is the set of engineered and organized circumstances that is responsible for putting people in a position where they end up doing things that go wrong?”
Sidney Dekker, Just Culture


Books by Sidney Dekker:

Drift Into Failure: From Hunting Broken Components to Understanding Complex Systems (105 ratings)
Just Culture: Balancing Safety and Accountability (97 ratings)
Patient Safety: A Human Factors Approach (18 ratings)