Kindle Notes & Highlights
we will find that in all these instances the explanation for success hinges, in powerful and often counterintuitive ways, on how we react to failure.
contrast two of the most important safety-critical industries in the world today: health care and aviation.
the most profound difference is in their divergent approaches to failure.
Every aircraft is equipped with two almost-indestructible black boxes, one of which records instructions sent to the onboard electronic systems, and another that records the conversations and sounds in the cockpit.*
This ensures that procedures can be changed so that the same error never happens again.
In health care, however, things are very different.
published a landmark investigation called “To Err Is Human.” It reported that between 44,000 and 98,000 Americans die each year as a result of preventable medical errors.
Journal of Patient Safety put the number of premature deaths associated with preventable harm at more than 400,000 per year.
These figures place preventable medical error in hospitals as the third biggest killer in the United States—behind only heart disease and cancer.
The problem is not a small group of crazy, homicidal, incompetent doctors going around causing havoc. Medical errors follow a normal bell-shaped distribution.
Why, then, do so many mistakes happen? One of the problems is complexity.
Another problem is scarce resources.
A third issue is that doctors may have to make quick decisions.
there is also something deeper and more subtle at work, something that has little to do with resources, and everything to do with culture.
errors committed in hospitals (and in other areas of life) have particular trajectories, subtle but predictable patterns: what accident investigators call “signatures.”
a failure to learn from mistakes has been one of the single greatest obstacles to human progress.
A progressive attitude to failure turns out to be a cornerstone of success for any institution.
Society, as a whole, has a deeply contradictory attitude to failure.
we are quick to blame others who mess up.
We have a deep instinct to find scapegoats.
this has recursive effects, as we shall see. It is partly because we are so willing to blame others for their mistakes that we are so keen to conceal our own.
The net effect is simple: it obliterates openness and spawns cover-ups. It destroys the vital information we need in order to learn.
Studies have shown that we are often so worried about failure that we create vague goals.
We come up with face-saving excuses, even before we have attempted anything.
We cover up mistakes, not only to protect ourselves from others, but also from ourselves.
This basic perspective—that failure is profoundly negative, something to be ashamed of in ourselves and judgmental about in others—has deep cultural and psychological roots.
The purpose of this book is to offer a radically different perspective.
we need to redefine our relationship with failure; only by redefining failure will we unleash progress, creativity, and resilience.
the idea of a “closed loop,”
they never subjected the treatment to a proper test—and so they never detected failure.
Doctors were effectively killing patients for the better part of 1,700 years not because they lacked intelligence or compassion, but because they did not recognize the flaws in their own procedures.
For our purposes, a closed loop is where failure doesn’t lead to progress because information on errors and weaknesses is misinterpreted or ignored; an open loop does lead to progress because the feedback is rationally acted upon.
He realized that the mistake may have had a “signature,” a subtle pattern that, if acted upon, could save future lives. The doctors in charge of the operation couldn’t have known this for a simple but devastating reason: historically, health-care institutions have not routinely collected data on how accidents happen, and so cannot detect meaningful patterns, let alone learn from them.
our attitude to progress: instead of denying failure or spinning it, aviation learns from it.
what is important for our purposes is not the similarity between the two accidents; it is the difference in response.
In aviation, things are radically different: learning from failure is hardwired into the system.
In the aftermath of the investigation the report is made available to everyone. Airlines have a legal responsibility to implement the recommendations. Every pilot in the world has free access to the data.
“Learn from the mistakes of others. You can’t live long enough to make them all yourself.”
so, too, do “small...
This, then, is what we might call “black box thinking.”*
rather, it is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It is about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.
Failure is thus a signpost.
Psychologists often make a distinction between mistakes where we already know the right answer and mistakes where we don’t.
On the whole, we will be looking at the first type of failure in the early part of this book and the second type in the latter part.
in both scenarios, error is indispensable to the process of discovery.
In effect, practice is about harnessing the benefits of learning from failure while reducing its cost.