Black Box Thinking: Why Most People Never Learn from Their Mistakes - But Some Do
we will find that in all these instances the explanation for success hinges, in powerful and often counterintuitive ways, on how we react to failure.
contrast two of the most important safety-critical industries in the world today: health care and aviation.
the most profound difference is in their divergent approaches to failure.
Every aircraft is equipped with two almost-indestructible black boxes, one of which records instructions sent to the onboard electronic systems, and another that records the conversations and sounds in the cockpit.*
This ensures that procedures can be changed so that the same error never happens again.
In health care, however, things are very different.
published a landmark investigation called “To Err Is Human.” It reported that between 44,000 and 98,000 Americans die each year as a result of preventable medical errors.
Journal of Patient Safety put the number of premature deaths associated with preventable harm at more than 400,000 per year.
These figures place preventable medical error in hospitals as the third biggest killer in the United States—behind only heart disease and cancer.
The problem is not a small group of crazy, homicidal, incompetent doctors going around causing havoc. Medical errors follow a normal bell-shaped distribution.
Why, then, do so many mistakes happen? One of the problems is complexity.
Another problem is scarce resources.
A third issue is that doctors may have to make quick decisions.
there is also something deeper and more subtle at work, something that has little to do with resources, and everything to do with culture.
errors committed in hospitals (and in other areas of life) have particular trajectories, subtle but predictable patterns: what accident investigators call “signatures.”
a failure to learn from mistakes has been one of the single greatest obstacles to human progress.
A progressive attitude to failure turns out to be a cornerstone of success for any institution.
Society, as a whole, has a deeply contradictory attitude to failure.
we are quick to blame others who mess up.
We have a deep instinct to find scapegoats.
this has recursive effects, as we shall see. It is partly because we are so willing to blame others for their mistakes that we are so keen to conceal our own.
The net effect is simple: it obliterates openness and spawns cover-ups. It destroys the vital information we need in order to learn.
Studies have shown that we are often so worried about failure that we create vague goals,
We come up with face-saving excuses, even before we have attempted anything.
We cover up mistakes, not only to protect ourselves from others, but also to protect ourselves from ourselves.
This basic perspective—that failure is profoundly negative, something to be ashamed of in ourselves and judgmental about in others—has deep cultural and psychological roots.
The purpose of this book is to offer a radically different perspective.
we need to redefine our relationship with failure,
Only by redefining failure will we unleash progress, creativity, and resilience.
the idea of a “closed loop,”
they never subjected the treatment to a proper test—and so they never detected failure.
Doctors were effectively killing patients for the better part of 1,700 years not because they lacked intelligence or compassion, but because they did not recognize the flaws in their own procedures.
for our purposes a closed loop is where failure doesn’t lead to progress because information on errors and weaknesses is misinterpreted or ignored; an open loop does lead to progress because the feedback is rationally acted upon.
He realized that the mistake may have had a “signature,” a subtle pattern that, if acted upon, could save future lives. The doctors in charge of the operation couldn’t have known this for a simple but devastating reason: historically, health-care institutions have not routinely collected data on how accidents happen, and so cannot detect meaningful patterns, let alone learn from them.
our attitude to progress: instead of denying failure, or spinning it, aviation learns from failure.
what is important for our purposes is not the similarity between the two accidents; it is the difference in response.
In aviation, things are radically different: learning from failure is hardwired into the system.
The interested parties are given every reason to cooperate, since the evidence compiled by the accident investigation branch is inadmissible in court proceedings.
Matthew Ackerman
Interesting. Incentive aligned with desired outcome. Built into system.
In the aftermath of the investigation the report is made available to everyone. Airlines have a legal responsibility to implement the recommendations. Every pilot in the world has free access to the data.
“Learn from the mistakes of others. You can’t live long enough to make them all yourself.”
so, too, do “small...
individuals are not intimidated about admitting to errors because they recognize their value.
Matthew Ackerman
Built into the culture
This, then, is what we might call “black box thinking.”*
rather, it is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It is about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.
Failure is thus a signpost.
Psychologists often make a distinction between mistakes where we already know the right answer and mistakes where we don’t.
On the whole, we will be looking at the first type of failure in the early part of this book and the second type in the latter part.
in both scenarios, error is indispensable to the process of discovery.
In effect, practice is about harnessing the benefits of learning from failure while reducing its cost.