Black Box Thinking: Why Some People Never Learn from Their Mistakes - But Some Do
Kindle Notes & Highlights
we will find that in all these instances the explanation for success hinges, in powerful and often counterintuitive ways, on how we react to failure.
Failure is something we all have to endure from time to time, whether it is the local soccer team losing a match, underperforming at a job interview, or flunking an examination.
For doctors and others working in safety-critical industries, getting it wrong can ...
And that is why a powerful way to begin this investigation, and to glimpse the inextricable connection between failure and success, is to contrast two of the most important safety-critical industries in the world today: health care and aviation. These organizations have differences in psychology, culture, and institutional change, as we...
In 2013 a study published in the Journal of Patient Safety [8] put the number of premature deaths associated with preventable harm at more than 400,000 per year.
(Categories of avoidable harm include misdiagnosis, dispensing the wrong drugs, injuring the patient during surgery, operating on the wrong part of the body, improper transfusions, falls, burns, pressure ulcers, and postoperative complications.) Testifying to a Senate hearing in the summer of 2014, Peter J. Pronovost, MD, professor at the Johns Hopkins University School of Medicine and one of the most respected clinicians in the world, pointed out that this is the equivalent of two jumbo jets falling out of the sky every twenty-four hours.
“What these numbers say is that every day, a 747, two of them are crashing. Every two months, 9/11 is occurring,” he said. “We would not tolerate that degr...
preventable medical error in hospitals as the third biggest killer in the United States—behind only heart disease and cancer.
the full death toll due to avoidable error in American health care is more than half a million people per year. [10]
In the UK the numbers are also alarming. A report by the National Audit Office in 2005 estimated that up to 34,000 people are killed per year due to human error.
They occur most often not when clinicians get bored or lazy or malign, but when they are going about their business with the diligence and concern you would expect from the medical profession.
for reasons both prosaic and profound, a failure to learn from mistakes has been one of the single greatest obstacles to human progress.
Studies have shown that we are often so worried about failure that we create vague goals, so that nobody can point the finger when we don’t achieve them.
And if the failure is a tragedy, such as the death of Elaine Bromiley, learning from failure takes on a moral urgency.
“They learn how to talk about unanticipated outcomes until a ‘mistake’ morphs into a ‘complication.’ Above all, they learn not to tell the patient anything.”
In a different study of 800 patient records in three leading hospitals, researchers found more than 350 medical errors. How many of these mistakes were voluntarily reported by clinicians? Only 4. [28]
it is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit.
It is about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.
By finding the places where a theory fails, we set the stage for the creation of a new, more powerful theory: a theory that explains both why water boils at 100°C at ground level and at a different temperature at altitude. This is the stuff of scientific progress.
It is by testing our ideas, subjecting them to failure, that we set the stage for growth.
These findings have led to the conclusion that expertise is, at least in part, about practice (the so-called 10,000-hour rule).
The intuitions of nurses and chess players are constantly checked and challenged by their errors. They are forced to adapt, to improve, to restructure their judgments. This is a hallmark of what is called deliberate practice.
But there is a deeper problem. Psychotherapists rarely track their clients after therapy has finished. This means that they do not get any feedback on the lasting impact of their interventions. They have no idea if their methods are working or failing—if the client’s long-term mental functioning is actually improving. And that is why the clinical judgments of many practitioners don’t improve over time. They are effectively playing golf in the dark. [11]
radiologists can’t learn from the error.
If we wish to improve the judgment of aspiring experts, then we shouldn’t just focus on conventional issues like motivation and commitment.
One of his key reforms was to encourage staff to make a report whenever they spotted an error that could harm patients. It was almost identical to the reporting system in aviation and at Toyota. He instituted a twenty-four-hour hotline as well as an online reporting system. He called them Patient Safety Alerts.
The new system represented a huge cultural shift for staff. Mistakes were frowned on at Virginia Mason, just like elsewhere in health care. And because of the steep hierarchy, nurses and junior doctors were fearful of reporting senior colleagues.
To Kaplan’s surprise and disappointment, few reports were made. An enlightened innovation had bombed due to a confl...
Gary Kaplan responded not by evading or spinning, but by publishing a full and frank apology—the opposite of what happened after the death of Elaine Bromiley. “We just can’t say how appalled we are at ourselves,” it read. “You can’t understand something you hide.” The apology was welcomed by relatives and helped them to understand what had happened to a beloved family member.
“the death was like a rallying cry,” Kaplan says. “It gave us the cultural push we needed to recognize how serious an issue this is.”
The difference between aviation and health care is sometimes couched in the language of incentives.
When pilots make mistakes, it results in their own deaths. When a doctor makes a mistake, it results in the death of someone else. That is why pilots are better motivated than doctors to reduce mistakes.
in health care, doctors are not supposed to make mistakes. The culture implies that senior clinicians are infallible. Is it any wonder that errors are stigmatized and that the system is set up to ignore and deny rather than investigate and learn?
crash investigators [distill] the information into its practical essence.
But an autopsy allows his colleagues to look inside a body and actually determine the precise cause of death. It is the medical equivalent of a black box.
why conduct an investigation if it might demonstrate that you made a mistake?
the cultural difference between these two institutions is of deep importance
So that others may learn, and even more may live.
when a doctor diagnoses a tumor that isn’t actually there.
an error of commission.
when a doctor fails to diagnose a tumor ...
an error of omission.
it is possible to reduce both kinds of error at the same time.
We cannot learn if we close our eyes to inconvenient truths, but we will see that this is precisely what the human mind is wired to do, often in astonishing ways.
“DNA testing is to justice what the telescope is for the stars: not a lesson in biochemistry, not a display of the wonders of magnifying optical glass, but a way to see things as they really are,” Scheck has said. “It is a revelation machine.” [7]
When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.
accept that our original judgments may have been at fault.
It forces us to acknowledge that we can sometimes be wrong, even on issues on which we have staked a great deal.
We reframe the evidence. We filter it, we spin it, or ignore it altogether. That way, we can carry on under the comforting assumption that we were right all along.
It is only when we have staked our ego that our mistakes of judgment become threatening. That is when we build defensive walls and deploy cognitive filters.