The Field Guide to Understanding Human Error Quotes

The Field Guide to Understanding Human Error by Sidney Dekker
613 ratings, 4.22 average rating, 70 reviews
Showing 1-17 of 17
“Underneath every simple, obvious story about ‘human error,’ there is a deeper, more complex story about the organization.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“There is almost no human action or decision that cannot be made to look flawed and less sensible in the misleading light of hindsight.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“There is almost no human action or decision that cannot be made to look flawed and less sensible in the misleading light of hindsight. It is essential that the critic should keep himself constantly aware of that fact.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“Valujet flight 592 crashed after takeoff from Miami airport because oxygen generators in its cargo hold caught fire. The generators had been loaded onto the airplane by employees of a maintenance contractor, who were subsequently prosecuted. The editor of Aviation Week and Space Technology “strongly believed the failure of SabreTech employees to put caps on oxygen generators constituted willful negligence that led to the killing of 110 passengers and crew. Prosecutors were right to bring charges. There has to be some fear that not doing one’s job correctly could lead to prosecution.” But holding individuals accountable by prosecuting them misses the point. It shortcuts the need to learn fundamental lessons, if it acknowledges that fundamental lessons are there to be learned in the first place. In the SabreTech case, maintenance employees inhabited a world of boss-men and sudden firings, and one that did not supply safety caps for expired oxygen generators. The airline may have been as inexperienced and under as much financial pressure as people in the maintenance organization supporting it. It was also a world of language difficulties—not only because many were Spanish speakers in an environment of English engineering language: “Here is what really happened. Nearly 600 people logged work time against the three Valujet airplanes in SabreTech’s Miami hangar; of them 72 workers logged 910 hours across several weeks against the job of replacing the ‘expired’ oxygen generators—those at the end of their approved lives. According to the supplied Valujet work card 0069, the second step of the seven-step process was: ‘If the generator has not been expended install shipping cap on the firing pin.’ This required a gang of hard-pressed mechanics to draw a distinction between canisters that were ‘expired’, meaning the ones they were removing, and canisters that were not ‘expended’, meaning the same ones, loaded and ready to fire, on which they were now expected to put nonexistent caps. Also involved were canisters which were expired and expended, and others which were not expired but were expended. And then, of course, there was the simpler thing—a set of new replacement canisters, which were both unexpended and unexpired.” These were conditions that existed long before the Valujet accident, and that exist in many places today. Fear of prosecution stifles the flow of information about such conditions. And information is the prime asset that makes a safety culture work. A flow of information earlier could in fact have told the bad news. It could have revealed these features of people’s tasks and tools; these longstanding vulnerabilities that form the stuff that accidents are made of. It would have shown how ‘human error’ is inextricably connected to how the work is done, with what resources, and under what circumstances and pressures.”
Sidney Dekker, The Field Guide to Understanding Human Error
“This is at the heart of the professional pilot’s eternal conflict,” writes Wilkinson in a comment on the November Oscar case. “Into one ear the airlines lecture, ‘Never break regulations. Never take a chance. Never ignore written procedures. Never compromise safety.’ Yet in the other they whisper, ‘Don’t cost us time. Don’t waste our money. Get your passengers to their destination—don’t find reasons why you can’t.’”
Sidney Dekker, The Field Guide to Understanding Human Error
“But the point of a ‘human error’ investigation is to understand why people’s assessments and actions made sense at the time, given their context, and without knowledge of outcome, not to point out what they should have done instead.”
Sidney Dekker, The Field Guide to Understanding Human Error
“Safety and risk are made and broken the whole time, throughout your organization. You are not the custodian of an otherwise safe system that you need to protect from erratic human beings.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“But errors are consequences: the leakage that occurs around the edges when you put pressure on a system without taking other factors into account.”
Sidney Dekker, The Field Guide to Understanding Human Error
“Indeed, automation does not fail often, which limits people’s ability to practice the kinds of breakdown scenarios that still justify their marginal presence in the system. Here, the human is painted as a passive monitor, whose greatest safety risks would lie in deskilling, complacency, vigilance decrements and the inability to intervene in deteriorating circumstances.”
Sidney Dekker, The Field Guide to Understanding Human Error
“The question is not how pilots can be so silly to rely on this, but how they have been led to believe (built up a mental model) that this is actually effective.”
Sidney Dekker, The Field Guide to Understanding Human Error
“Challenges to existing views are generally uncomfortable. Indeed, for most people and organizations, coming face to face with a mismatch between what they believed and what they have just experienced is difficult. These people and organizations will do anything to reduce the nature of the surprise.”
Sidney Dekker, The Field Guide to Understanding Human Error
“Asking what is the cause is just as bizarre as asking what is the cause of not having an accident. Accidents have their basis in the real complexity of the system, not its apparent simplicity.”
Sidney Dekker, The Field Guide to Understanding Human Error
“People do not come to work to do a bad job. Safety in complex systems is not a result of getting rid of people, of reducing their degrees of freedom. Safety in complex systems is created by people through practice—at all levels of an organization.”
Sidney Dekker, The Field Guide to Understanding Human Error
“What is the cause of the accident? This question is just as bizarre as asking what the cause is of not having an accident.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“Accountability can mean letting people tell their account, their story.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“Saying what people failed to do has no role in understanding ‘human error’.”
Sidney Dekker, The Field Guide to Understanding 'Human Error'
“Safety improvements come from organizations monitoring and understanding the gap between procedures and practice.”
Sidney Dekker, The Field Guide to Understanding Human Error