Black Box Thinking: Why Most People Never Learn from Their Mistakes - But Some Do
If there is an accident, the boxes are opened, the data is analyzed, and the reason for the accident excavated. This ensures that procedures can be changed so that the same error never happens again. Through this method aviation has attained an impressive safety record.
In health care, however, things are very different.
a million patients are injured by errors during hospital treatment and that 120,000 die each year in America alone.
preventable medical error in hospitals as the third biggest killer in the United States—behind only heart disease and cancer.
Why, then, do so many mistakes happen? One of the problems is complexity. The World Health Organization lists 12,420 diseases and disorders, each of which requires different protocols.
Another problem is scarce resources. Doctors are often overworked and hospitals stretched;
A third issue is that doctors may have to make quick decisions. With serious cases there is rarely sufficient time to consider all the alternative treatments.
a failure to learn from mistakes has been one of the single greatest obstacles to human progress.
A progressive attitude to failure turns out to be a cornerstone of success for any institution.
Society, as a whole, has a deeply contradictory attitude to failure. Even as we find excuses for our own failings, we are quick to blame others who mess up.
Far from learning from mistakes, we edit them out of the official autobiographies we all keep in our own heads.
we need to redefine our relationship with failure, as individuals, as organizations, and as societies.
Only by redefining failure will we unleash progress, creativity, and resilience.
while medicine has evolved rapidly, via an “open loop,” health care (i.e., the institutional question of how treatments are delivered by real people working in complex systems) has not.
for our purposes a closed loop is where failure doesn’t lead to progress because information on errors and weaknesses is misinterpreted or ignored; an open loop does lead to progress because the feedback is rationally acted upon.)
We all want to succeed, whether we are entrepreneurs, sportsmen, politicians, scientists, or parents. But at a collective level, at the level of systemic complexity, success can only happen when we admit our mistakes, learn from them, and create a climate where it is, in a certain sense, “safe” to fail.
Self-justification, allied to a wider cultural allergy to failure, morphs into an almost insurmountable barrier to progress.*
historically, health-care institutions have not routinely collected data on how accidents happen, and so cannot detect meaningful patterns, let alone learn from them. In aviation, on the other hand, pilots are generally open and honest about their own mistakes (crash landings, near misses). The industry has powerful, independent bodies designed to investigate crashes. Failure is not regarded as an indictment of the specific pilot who messes up, but a precious learning opportunity for all pilots, all airlines, and all regulators.
instead of denying failure, or spinning it, aviation learns from failure.
shudder.
Instead of concealing failure, or skirting around it, aviation has a system where failure is data rich. In the event of an accident, investigators, who are independent of the airlines, the pilots’ union, and the regulators, are given full rein to explore the wreckage and to interrogate all other evidence. Mistakes are not stigmatized, but regarded as learning opportunities. The interested parties are given every reason to cooperate, since the evidence compiled by the accident investigation branch is inadmissible in court proceedings. This increases the likelihood of full disclosure. In the …
“Learn from the mistakes of others. You can’t live long enough to make them all yourself.”
When pilots experience a near miss with another aircraft, or have been flying at the wrong altitude, they file a report. Providing that it is submitted within ten days, pilots enjoy immunity.
In each case the investigators realized that crews were losing their perception of time. Attention, it turns out, is a scarce resource: if you focus on one thing, you will lose awareness of other things.
another fundamental problem involved communication. Engineer Mendenhall had spotted the fuel problem. He had given a number of hints to the captain and, as the situation became serious, made direct references to the dwindling reserves.
but he couldn’t bring himself to challenge his boss directly.
Social hierarchies inhibit assertiveness. We talk to those in authority in what is called “mitigated language.”
The problem was not a lack of diligence or motivation, but a system insensitive to the limitations of human psychology.
it is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It is about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.
cockpits.
failures are inevitable because the world is complex and we will never fully understand its subtleties. The model, as social scientists often remind us, is not the system. Failure is thus a signpost. It reveals a feature of our world we hadn’t grasped fully and offers vital clues about how to update our models, strategies, and behaviors.
But sometimes mistakes are consciously made as part of a process of discovery. Drug companies test lots of different combinations of chemicals to see which have efficacy and which don’t. Nobody knows in advance which will work and which won’t, but this is precisely why they test extensively, and fail often. It is integral to progress.
error is indispensable to the process of discovery.
practice is about harnessing the benefits of learning from failure while reducing its cost.
The more we can fail in practice, the more we can learn, enabling us to succeed when it really matters.
Checklists originally emerged from a series of crashes in the 1930s. Ergonomic cockpit design was born out of the disastrous series of accidents involving B-17s. Crew Resource Management emerged from the wreckage of United Airlines 173.
This is the paradox of success: it is built upon failure.
Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died . . . We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them.
Most closed loops exist because people deny failure or try to spin it. With pseudosciences the problem is more structural. They have been designed, wittingly or otherwise, to make failure impossible. That is why, to their adherents, they are so mesmerizing. They are compatible with everything that happens. But that also means they cannot learn from anything.
Scientists observe nature, create theories, and then seek to prove them by amassing as much supporting evidence as possible. But we can now see that this is only a part of the truth. Science is not just about confirmation, it is also about falsification. Knowledge does not progress merely by gathering confirmatory data, but by looking for contradictory data. Take the hypothesis that water boils at 100°C. This seems true enough. But, as we now know, the hypothesis breaks down when water is boiled at altitude. By finding the places where a theory fails, we set the stage for the creation of a …
Failure, then, is hardwired into both the logic and spirit of scientific progress. Mankind’s most successful discipline has grown by challenging orthodoxy and by subjecting ideas to testing. Individual scientists may sometimes be dogmatic but, as a community, scientists recognize that theories, particularly those at the frontiers of our knowledge, are often fallible or incomplete. It is by testing our ideas, subjecting them to failure, that we set the stage for growth.
an airplane journey represents a kind of hypothesis: namely, that this aircraft, with this design, these pilots, and this system of air traffic control, will reach its destination safely. Each flight represents a kind of test. A crash, in a certain sense, represents a falsification of the hypothesis. That is why accidents have a particular significance in improving system safety, rather as falsification drives science.
Not everyone has the potential to become world champion, but most people can develop mastery with training and application.*
there are many professions where practice and experience do not have any effect. People train for months and sometimes years without improving at all. Research on psychotherapists, for instance, finds that trainees obtain results that are as good as those of licensed “experts.” Similar results have been found with regard to college admissions officers, personnel selectors, and clinical psychologists.*
The intuitions of nurses and chess players are constantly checked and challenged by their errors. They are forced to adapt, to improve, to restructure their judgments. This is a hallmark of what is called deliberate practice. For psychotherapists things are radically different. Their job is to improve the mental functioning of their patients. But how can they tell when their interventions are going wrong or, for that matter, right? Where is the feedback? Most psychotherapists gauge how their clients are responding to treatment not with objective data, but by observing them in clinic. But these …
Feedback, when delayed, is considerably less effective in improving intuitive judgment.*
If we wish to improve the judgment of aspiring experts, then we shouldn’t just focus on conventional issues like motivation and commitment. In many cases, the only way to drive improvement is to find a way of “turning the lights on.” Without access to the “error signal,” one could spend years in training or in a profession without improving at all.
Those who reported mistakes were surprised to learn that, except in situations in which they had been clearly reckless, they were praised, not punished. Dr. Henry Otero, an oncologist, made a report after being told by a colleague that he had failed to spot the low magnesium level of a patient. “I missed it,” he told a newspaper. “I didn’t know how I missed it. But I realized it’s not about me, it’s about the patient. The process needs to stop me making a mistake.”
learning from mistakes has two components. The first is a system. Errors can be thought of as the gap between what we hoped would happen and what actually did happen. Cutting-edge organizations are always seeking to close this gap, but in order to do so they have to have a system geared up to take advantage of these learning opportunities.
Even the most beautifully constructed system will not work if professionals do not share the information that enables it to flourish.