Kindle Notes & Highlights
by Matthew Syed
Read between August 10 and November 21, 2024
we will find that in all these instances the explanation for success hinges, in powerful and often counterintuitive ways, on how we react to failure.
Failure is something we all have to endure from time to time, whether it is the local soccer team losing a match, underperforming at a job interview, or flunking an examination.
For doctors and others working in safety-critical industries, getting it wrong can ...
And that is why a powerful way to begin this investigation, and to glimpse the inextricable connection between failure and success, is to contrast two of the most important safety-critical industries in the world today: health care and aviation. These organizations have differences in psychology, culture, and institutional change, as we...
In 2013 a study published in the Journal of Patient Safety8 put the number of premature deaths associated with preventable harm at more than 400,000 per year.
(Categories of avoidable harm include misdiagnosis, dispensing the wrong drugs, injuring the patient during surgery, operating on the wrong part of the body, improper transfusions, falls, burns, pressure ulcers, and postoperative complications.) Testifying to a Senate hearing in the summer of 2014, Peter J. Pronovost, MD, professor at the Johns Hopkins University School of Medicine and one of the most respected clinicians in the world, pointed out that this is the equivalent of two jumbo jets falling out of the sky every twenty-four hours.
“What these numbers say is that every day, a 747, two of them are crashing. Every two months, 9/11 is occurring,” he said. “We would not tolerate that degr...
preventable medical error in hospitals as the third biggest killer in the United States—behind only heart disease and cancer.
the full death toll due to avoidable error in American health care is more than half a million people per year.10
In the UK the numbers are also alarming. A report by the National Audit Office in 2005 estimated that up to 34,000 people are killed per year due to human error.
They occur most often not when clinicians get bored or lazy or malign, but when they are going about their business with the diligence and concern you would expect from the medical profession.
for reasons both prosaic and profound, a failure to learn from mistakes has been one of the single greatest obstacles to human progress.
Studies have shown that we are often so worried about failure that we create vague goals, so that nobody can point the finger when we don’t achieve them.
And if the failure is a tragedy, such as the death of Elaine Bromiley, learning from failure takes on a moral urgency.
“They learn how to talk about unanticipated outcomes until a ‘mistake’ morphs into a ‘complication.’ Above all, they learn not to tell the patient anything.”
In a different study of 800 patient records in three leading hospitals, researchers found more than 350 medical errors. How many of these mistakes were voluntarily reported by clinicians? Only 4.
it is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit.
It is about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.
By finding the places where a theory fails, we set the stage for the creation of a new, more powerful theory: a theory that explains both why water boils at 100°C at ground level and at a different temperature at altitude. This is the stuff of scientific progress.
It is by testing our ideas, subjecting them to failure, that we set the stage for growth.
These findings have led to the conclusion that expertise is, at least in part, about practice (the so-called 10,000-hour rule).
The intuitions of nurses and chess players are constantly checked and challenged by their errors. They are forced to adapt, to improve, to restructure their judgments. This is a hallmark of what is called deliberate practice.
But there is a deeper problem. Psychotherapists rarely track their clients after therapy has finished. This means that they do not get any feedback on the lasting impact of their interventions. They have no idea if their methods are working or failing—if the client’s long-term mental functioning is actually improving. And that is why the clinical judgments of many practitioners don’t improve over time. They are effectively playing golf in the dark.11
radiologists can’t learn from the error.
If we wish to improve the judgment of aspiring experts, then we shouldn’t just focus on conventional issues like motivation and commitment.
One of his key reforms was to encourage staff to make a report whenever they spotted an error that could harm patients. It was almost identical to the reporting system in aviation and at Toyota. He instituted a twenty-four-hour hotline as well as an online reporting system. He called them Patient Safety Alerts.
The new system represented a huge cultural shift for staff. Mistakes were frowned on at Virginia Mason, just like elsewhere in health care. And because of the steep hierarchy, nurses and junior doctors were fearful of reporting senior colleagues.
To Kaplan’s surprise and disappointment, few reports were made. An enlightened innovation had bombed due to a confl...
Gary Kaplan responded not by evading or spinning, but by publishing a full and frank apology—the opposite of what happened after the death of Elaine Bromiley. “We just can’t say how appalled we are at ourselves,” it read. “You can’t understand something you hide.” The apology was welcomed by relatives and helped them to understand what had happened to a beloved family member.
“the death was like a rallying cry,” Kaplan says. “It gave us the cultural push we needed to recognize how serious an issue this is.”
The difference between aviation and health care is sometimes couched in the language of incentives.
When pilots make mistakes, it results in their own deaths. When a doctor makes a mistake, it results in the death of someone else. That is why pilots are better motivated than doctors to reduce mistakes.
in health care, doctors are not supposed to make mistakes. The culture implies that senior clinicians are infallible. Is it any wonder that errors are stigmatized and that the system is set up to ignore and deny rather than investigate and learn?
crash investigators [distill] the information into its practical essence.
But an autopsy allows his colleagues to look inside a body and actually determine the precise cause of death. It is the medical equivalent of a black box.
why conduct an investigation if it might demonstrate that you made a mistake?
the cultural difference between these two institutions is of deep importance
So that others may learn, and even more may live.
when a doctor diagnoses a tumor that isn’t actually there.
an error of commission.
when a doctor fails to diagnose a tumor ...
an error of omission.
it is possible to reduce both kinds of error at the same time.
We cannot learn if we close our eyes to inconvenient truths, but we will see that this is precisely what the human mind is wired to do, often in astonishing ways.
“DNA testing is to justice what the telescope is for the stars: not a lesson in biochemistry, not a display of the wonders of magnifying optical glass, but a way to see things as they really are,” Scheck has said. “It is a revelation machine.”7
When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.
accept that our original judgments may have been at fault.
It forces us to acknowledge that we can sometimes be wrong, even on issues on which we have staked a great deal.
We reframe the evidence. We filter it, we spin it, or ignore it altogether. That way, we can carry on under the comforting assumption that we were right all along.
It is only when we have staked our ego that our mistakes of judgment become threatening. That is when we build defensive walls and deploy cognitive filters.