Kindle Notes & Highlights
Except no one remembered to ask the patient or the emergency medical technicians what the weapon was.
“Your mind doesn’t think of a bayonet in San Francisco,”
The stories surgeons tell one another are often about the shock of the unexpected—the bayonet in San Francisco, the cardiac arrest when all seemed fine—and sometimes about regret over missed possibilities.
The first is ignorance—we may err because science has given us only a partial understanding of the world and how it works.
The second type of failure the philosophers call ineptitude—because in these instances the knowledge exists, yet we fail to apply it correctly.
For nearly all of history, people’s lives have been governed primarily by ignorance.
it is only over the last several decades that science has filled in enough knowledge to make ineptitude as much our struggle as ignorance.
prescribing a statin to lower cholesterol and inflammation,
But now the problem we face is ineptitude, or maybe it’s “eptitude”—making sure we apply the knowledge we have consistently and correctly.
Getting the steps right is proving brutally hard, even if you know them.
If the knowledge of the best thing to do in a given situation does not exist, we are happy to have people simply make their best effort. But if the knowledge exists and is not applied correctly, it is difficult not to be infuriated.
It is not for nothing that the philosophers gave these failures so unmerciful a name—ineptitude. Those on the receiving end use other words, like negligence or even heartlessness.
And defeat under conditions of complexity occurs far more often despite great effort than from a lack of it. That’s why the traditional solution in most professions has not been to punish failure but instead to encourage more experience and training.
the volume and complexity of what we know has exceeded our individual ability to deliver its benefits correctly, safely, or reliably. Knowledge has both saved us and burdened us.
we need a different strategy for overcoming failure, one that builds on experience and takes advantage of the knowledge people have but somehow also makes up for our inevitable human inadequacies.
It is a checklist.
Her pupils were dilated and unreactive to light, indicating cessation of brain function.
Medicine has become the art of managing extreme complexity—and a test of whether such complexity can, in fact, be humanly mastered.
World Health Organization’s international classification of diseases has grown to distinguish more than thirteen thousand different diseases, syndromes, and types of injury—more than thirteen thousand different ways, in other words, that the body can fail. And, for nearly all of them, science has given us things we can do to help. If we cannot cure the disease, then we can usually reduce the harm and misery it causes.
But extreme complexity is the rule for almost everyone.
One of the most common diagnoses, it turned out, was “Other.” On a hectic day, when you’re running two hours behind and the people in the waiting room are getting irate, you may not take the time to record the precise diagnostic codes in the database. But, even when you do have the time, you commonly find that the particular diseases your patients have do not actually exist in the computer system.
Scientists continue to report important new genetic findings, subtypes of cancer, and other diagnoses—not to mention treatments—almost weekly. The complexity is increasing so fast that even the computers cannot keep up.
a substantial part of the credit goes to the abilities intensive care units have developed to take artificial control of failing bodies.
Critical care has become an increasingly large portion of what hospitals do.
The average stay of an ICU patient is four days, and the survival rate is 86 percent.
Going into an ICU, being put on a mechanical ventilator, having tubes and wires run into and out of you,...
Fifteen years ago, Israeli scientists published a study in which engineers observed patient care in ICUs for twenty-four-hour stretches. They found that the average patient required 178 individual actions per day, ranging from administering a drug to suctioning the lungs, and every one of them posed risks. Remarkably, the nurses and doctors were observed to make an error in just 1 percent...
brush their teeth twice a day to avoid pneumonia from bacterial buildup in their mouths.
This is the reality of intensive care: at any point, we are as apt to harm as we are to heal.
All in all, about half of ICU patients end up experiencing a serious complication, and once that occurs the chances of survival drop sharply.
Here, then, is the fundamental puzzle of modern medical care: you have a desperately sick patient and in order to have a chance of saving him you have to get the knowledge right and then you have to make sure that the 178 daily tasks that follow are done correctly—despite some monitor’s alarm going off for God knows what reason, despite the patient in the next bed crashing, despite a nurse poking his head around the curtain to ask whether someone could help “get this lady’s chest open.”
There is complexity upon complexity.
Expertise is the mantra of modern medicine.
We live in the era of the superspecialist—of clinicians who have taken the time to practice, practice, practice at one narrow thing until they can do it better than anyone else. They have two advantages over ordinary specialists: greater knowledge of the details that matter and a learned ability to handle the complexities of the particular job. There are degrees of complexity, though, and medicine and other fields like it have grown so far beyond the usual kind that avoiding daily mistakes is proving impossible even for our most superspecialized.
Surgeons are so absurdly ultraspecialized that when we joke about right ear surgeons and left ear surgeons,
Americans today undergo an average of seven operations in their lifetime, with surgeons performing more than fifty million operations annually, yet the amount of harm remains substantial.
Instead, they came up with an ingeniously simple approach: they created a pilot’s checklist.
The test pilots made their list simple, brief, and to the point—short enough to fit on an index card, with step-by-step checks for takeoff, flight, landing, and taxiing. It had the kind of stuff that all pilots know to do.
But we believe our jobs are too complicated to reduce to a checklist.
The first is the fallibility of human memory and attention, especially when it comes to mundane, routine matters that are easily overlooked under the strain of more pressing events.
Faulty memory and distraction are a particular danger in what engineers call all-or-none processes: whether running to the store to buy ingredients for a cake, preparing an airplane for takeoff, or evaluating a sick person in the hospital, if you miss just one key thing, you might as well not have made the effort at all.
A further difficulty, just as insidious, is that people can lull themselves into skipping steps even when they remember them. In complex processes, after all, certain steps don’t always matter.
Checklists seem to provide protection against such failures. They remind us of the minimum necessary steps and make them explicit. They not only offer the possibility of verification but also instill a kind of discipline of higher performance.
These steps are no-brainers; they have been known and taught for years. So it seemed silly to make a checklist for something so obvious.
The researchers found that simply having the doctors and nurses in the ICU create their own checklists for what they thought should be done each day improved the consistency of care to the point that the average length of patient stay in intensive care dropped by half. These checklists accomplished what checklists elsewhere have done, Pronovost observed. They helped with memory recall and clearly set out the minimum necessary steps in a process.
Checklists, he found, established a higher standard of baseline performance.
It was hard to imagine that they could get their heads far enough above the daily tide of disasters to worry about the minutiae on some checklist.
The successes have been sustained for several years now—all because of a stupid little checklist.
a lesson is emerging: checklists seem able to defend anyone, even the experienced, against failure in many more tasks than we realized. They provide a kind of cognitive net. They catch mental flaws inherent in all of us—flaws of memory and attention and thoroughness. And because they do, they raise wide, unexpected possibilities.