Kindle Notes & Highlights
“necessary fallibility”
In such realms, Gorovitz and MacIntyre point out, we have just two reasons that we may nonetheless fail.
we cannot predict, heart attacks we still haven’t learned how to stop. The second type of failure the philosophers call ineptitude—because in these instances the knowledge exists, yet we fail to apply it correctly.
We knew little about what caused them or what could be done to remedy them. But sometime over the last several decades—and it is only over the last several decades—science has filled in enough knowledge to make ineptitude as much our struggle as ignorance.
They persist despite remarkable individual ability.
Here, then, is our situation at the start of the twenty-first century: We have accumulated stupendous know-how. We have put it in the hands of some of the most highly trained, highly skilled, and hardworking people in our society. And, with it, they have indeed accomplished extraordinary things. Nonetheless, that know-how is often unmanageable. Avoidable failures are common and persistent, not to mention demoralizing and frustrating, across many fields—from medicine to finance, business to government. And the reason is increasingly evident: the volume and complexity of what we know has
…
Intensive care succeeds only when we hold the odds of doing harm low enough for the odds of doing good to prevail.
Here, then, is the fundamental puzzle of modern medical care: you have a desperately sick patient and in order to have a chance of saving him you have to get the knowledge right and then you have to make sure that the 178 daily tasks that follow are done correctly—despite some monitor’s alarm going off for God knows what reason, despite the patient in the next bed crashing, despite a nurse poking his head around the curtain to ask whether someone could help “get this lady’s chest open.”
bankrupt.
But perhaps the fourth reveals a fever or low blood pressure or a galloping heart rate, and skipping it could cost a person her life.
In a complex environment, experts are up against two main difficulties. The first is the fallibility of human memory and attention, especially when it comes to mundane, routine matters that are easily overlooked under the strain of more pressing events. (When you’ve got a patient throwing up and an upset family member asking you what’s going on, it can be easy to forget that you have not checked her pulse.) Faulty memory and distraction are a particular danger in what engineers call all-or-none processes: whether running to the store to buy ingredients for a cake, preparing an airplane for
…
They remind us of the minimum necessary steps and make them explicit. They not only offer the possibility of verification but also instill a kind of discipline of higher performance. Which is precisely what happened with vital signs—though it was not doctors who deserved the credit.
In more than a third of patients, they skipped at least one.
They calculated that, in this one hospital, the checklist had prevented forty-three infections and eight deaths and saved two million dollars in costs.
The researchers found that simply having the doctors and nurses in the ICU create their own checklists for what they thought should be done each day improved the consistency of care to the point that the average length of patient stay in intensive care dropped by half.
Two professors who study the science of complexity—Brenda Zimmerman of York University and Sholom Glouberman of the University of Toronto—have proposed a distinction among three different kinds of problems in the world: the simple, the complicated, and the complex.
We are besieged by simple problems.
In the face of the unknown—the always nagging uncertainty about whether, under complex circumstances, things will really be okay—the builders trusted in the power of communication. They didn’t believe in the wisdom of the single individual, of even an experienced engineer. They believed in the wisdom of the group, the wisdom of making sure that multiple pairs of eyes were on a problem and then letting the watchers decide what to do.
Man is fallible, but maybe men are less so.
In response to risk, most authorities tend to centralize power and decision making.
The philosophy is that you push the power of decision making out to the periphery and away from the center. You give people the room to adapt, based on their experience and expertise. All you ask is that they talk to one another and take responsibility. That is what works.
Everyone was waiting for the cavalry, but a centrally run, government-controlled solution was not going to be possible.
“A lot of you are going to have to make decisions above your level. Make the best decision that you can with the information that’s available to you at the time, and, above all, do the right thing.”
No, the real lesson is that under conditions of true complexity—where the knowledge required exceeds that of any individual and unpredictability reigns—efforts to dictate every step from the center will fail. People need room to act and adapt. Yet they cannot succeed as isolated individuals, either—that is anarchy. Instead, they require a seemingly contradictory mix of freedom and expectation—expectation to coordinate, for example, and also to measure progress toward common goals.
would sink in, or the doors weren’t big enough to move the gear through. The contract rider read like a version of the Chinese Yellow Pages because there was so much equipment, and so many human beings to make it function.” So just as a little test, buried somewhere in the middle of the rider, would be article 126, the no-brown-M&M’s clause. “When I would walk backstage, if I saw a brown M&M in that bowl,” he wrote, “well, we’d line-check the entire production. Guaranteed you’re going to arrive at a technical error.… Guaranteed you’d run into a problem.” These weren’t trifles, the radio story
…
The secret, he pointed out to me, was that the soap was more than soap. It was a behavior-change delivery vehicle.
At the start of the study, the average number of bars of soap households used was not zero. It was two bars per week. In other words, they already had soap.
There was a check box for the nurse to verbally confirm with the team that they had the correct patient and the correct side of the body planned for surgery—something teams are supposed to verify in any case.
Likewise, a group of Southern California hospitals within the Kaiser health care system had studied a thirty-item “surgery preflight checklist”
the more familiar and widely dangerous issue is a kind of silent disengagement, the consequence of specialized technicians sticking narrowly to their domains.
“That’s not my problem” is possibly the worst thing people can think, whether they are starting an operation, taxiing an airplane full of passengers down a runway, or building a thousand-foot-tall skyscraper. But in medicine, we see it all the time. I’ve seen it in my own operating room.
They can each be technical masters at what they do. That’s what we train them to be, and that alone can take years. But the evidence suggests we need them to see their job not just as performing their isolated set of tasks well but also as helping the group get the best possible results. This requires finding a way to ensure that the group lets nothing fall between the cracks and also adapts as a team to whatever problems might arise.
The researchers called it an “activation phenomenon.” Giving people a chance to say something at the start seemed to activate their sense of participation and responsibility and their willingness to speak up.
You must define a clear pause point at which the checklist is supposed to be used
It is common to misconceive how checklists function in complex lines of work. They are not comprehensive how-to guides, whether for building a skyscraper or getting a plane out of trouble. They are quick and simple tools aimed to buttress the skills of expert professionals. And by remaining swift and usable and resolutely modest, they are saving thousands upon thousands of lives.
How this happened—it involved a checklist, of course—is instructive. But first think about what happens in most lines of professional work when a major failure occurs. To begin with, we rarely investigate our failures. Not in medicine, not in teaching, not in the legal profession, not in the financial world, not in virtually any other kind of work where the mistakes do not turn up on cable news. A single type of error can affect thousands, but because it usually touches only one person at a time, we tend not to search as hard for explanations.
right? Not necessarily. Just ticking boxes is not the ultimate goal here. Embracing a culture of teamwork and discipline is.
The work of medicine is too intricate and individual for that: good clinicians will not be able to dispense with expert audacity. Yet we should also be ready to accept the virtues of regimentation.
So Pabrai made a list of mistakes he’d seen—ones Buffett and other investors had made as well as his own. It soon contained dozens of different mistakes, he said. Then, to help him guard against them, he devised a matching list of checks—about seventy in all.
“they improve their outcomes with no increase in skill. That’s what we are doing when we use the checklist.”
And by adhering to this discipline—by taking just those few short minutes—they not only made sure the plane was fit to travel but also transformed themselves from individuals into a team, one systematically prepared to handle whatever came their way.
The checklist gets the dumb stuff out of the way, the routines your brain shouldn’t have to occupy itself with (Are the elevator controls set? Did the patient get her antibiotics on time? Did the managers sell all their shares? Is everyone on the same page here?), and lets it rise above to focus on the hard stuff (Where should we land?).
This is what it means to be a hero in the modern era. These are the rare qualities that we must understand are needed in the larger world.
But we could, and that is the ultimate point. We are all plagued by failures—by missed subtleties, overlooked knowledge, and outright errors. For the most part, we have imagined that little can be done beyond working harder and harder to catch the problems and clean up after them.
When we look closely, we recognize the same balls being dropped over and over, even by those of great ability and determination. We know the patterns. We see the costs. It’s time to try something else. Try a checklist.