Kindle Notes & Highlights
‘Human error’ requires a standard. For the attribution to make any sense at all, it requires the possibility of actions or assessments that are not, or would not have been, erroneous.
Putting in more rules, procedures and compliance demands runs into the problem that there is always a gap between how work is imagined (in rules or procedures) and how work is done. Pretending that this gap does not exist is like sticking your head in the sand. And trying to force the gap to close with more compliance demands and threats of sanctions will drive real practice from view.
The local rationality principle says that what people do makes sense to them at the time, given their goals, attentional focus and knowledge; otherwise they wouldn’t be doing it. In other words: people do not come to work to do a bad job. Pilots do not check in for a flight in order to die. Nurses do not sign in to go kill a patient (and if they do, that takes you into the realm of sabotage, criminality or terrorism, which requires different explanations and interventions, and is not part of this book).
The “tunnel”: understanding ‘human error’ is about understanding the “inside” perspective, not the outside or hindsight one.
Psychotechnik (psychotechnics)
A “Bad Apple” problem, to the extent that you can prove its existence, is a system problem and a system responsibility.
“Error” is not a cause of trouble but a symptom of trouble.
Even the editors of the British Medical Journal have decided that a Bad Apple problem is a systems problem, and that addressing it is a systems responsibility: the time has come, they say, to design and evaluate systems that identify problematic individuals. So in healthcare, too, the recruitment and retention of staff who turn out to be ineffective in their role is a systems issue, not one of defective personal accountability.
A systems approach understands that each component or contributor in a system has specific responsibilities to help attain the system’s overall goals.