Kindle Notes & Highlights
But even if we practice diligently, we will still endure real-world failure from time to time. And it is often in these circumstances, when failure is most threatening to our ego, that we need to learn most of all.
In effect, the holes in the returning aircraft represented areas where a bomber could take damage and still return home safely.
The missing data is equally important. Inversion, or a similar thinking process that flips the assumption, question, problem, or conclusion on its head, could reveal more accurate or objective insights.
“Learning from failure is anything but straightforward. The attitudes and activities required to effectively detect and analyze failures are in short supply in most companies, and the need for context-specific learning strategies is underappreciated. Organizations need new and better ways to go beyond lessons that are superficial.”
Now we will take a brief look at success, and our responses to it.
This is the paradox of success: it is built upon failure.
Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died . . . We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them.
Take science, a discipline where learning from failure is part of the method.
science progresses through its vigilant response to its own mistakes.
Most closed loops exist because people deny failure or try to spin it.
They are compatible with everything that happens. But that also means they cannot learn from anything.
Science is not just about confirmation; it is also about falsification. Knowledge does not progress merely by gathering confirmatory data, but by looking for contradictory data.
you could observe a million white swans, but this would not prove the proposition: all swans are white.
The observation of a single black swan, on the other hand, would conclusively demonstrate its falsehood.
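An illustrative aside (the formal notation below is mine, not the book's): Popper's asymmetry between verification and falsification can be written in simple predicate logic.

$$\text{Hypothesis: } \forall x\,\bigl(\mathrm{Swan}(x) \rightarrow \mathrm{White}(x)\bigr)$$

$$\text{Confirmations } \mathrm{Swan}(a_1)\wedge\mathrm{White}(a_1),\ \dots,\ \mathrm{Swan}(a_n)\wedge\mathrm{White}(a_n) \text{ never entail the hypothesis,}$$

$$\text{but a single observation } \mathrm{Swan}(b)\wedge\neg\mathrm{White}(b) \text{ entails its negation.}$$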
Each flight represents a kind of test. A crash, in a certain sense, represents a falsification of the hypothesis. That is why accidents have a particular significance in improving system safety, rather as falsification drives science.
Chess grandmasters: these individuals have practiced not for weeks or months, but often for years.
These findings have led to the conclusion that expertise is, at least in part, about practice (the so-called 10,000-hour rule).
further studies seemed to contradict this finding.
there are many professions where practice and experience do not have any effect.
How can experience be so valuable in some professions but almost worthless in others?
How could you progress if you don’t have a clue where the ball has landed?
You wouldn’t have any data to improve your accuracy.
Think about being a chess player. When you make a poor move, you are instantly punished by your opponent.
The intuitions of nurses and chess players are constantly checked and challenged by their errors.
They are forced to adapt, to improve, to restructure their judgments. This is a hallmark of what is called deliberate practice.
Feedback, when delayed, is considerably less effective in improving intuitive judgment.*
But notice one final thing: students don’t study these “failed” scientific theories anymore. Why would they?
But this tendency creates a blind spot.
By looking only at the theories that have survived, we don’t notice the failures that made them possible.
It was while at the Toyota plant that he had a revelation. Toyota has a rather unusual production process. If anybody on the production line is having a problem or observes an error, that person pulls a cord that halts production across the plant.
This success is not a one-off or a fluke; it is a method. Properly instituted learning cultures have transformed the performance of hospitals around the world.
learning from mistakes has two components. The first is a system. Errors can be thought of as the gap between what we hoped would happen and what actually did happen.
each system has a basic structure at its heart: mechanisms that guide learning and self-correction.
the most beautifully constructed system will not work if professionals do not share the information that enables it to flourish.
It was only when the mindset of the organization changed that the system started to deliver amazing results.
The difference between aviation and health care is sometimes couched in the language of incentives.
But this analysis misses the crucial point. Remember that pilots died in large numbers in the early days of aviation. This was not because they lacked the incentive to live, but because the system had so many flaws. Failure is inevitable in a complex world.
incentives to improve performance can only have an impact, in many circumstances, if there is a prior understanding of how improvement actually happens.
Unless we alter the way we conceptualize failure, incentives for success can often be impotent.
One particular problem in health care is not just the limited capacity to learn from mistakes; it is that even when mistakes are detected, the learning opportunities do not flow throughout the system.
called the “adoption rate.”
The problem is not that the information doesn’t exist; rather, it is the way it is formatted.
The reason is more often that the necessary knowledge has not been translated into a simple, usable and systematic form.
[distill] the information into its practical essence.
When the probability of error is high, the importance of learning from mistakes is more essential, not less.
The key issue, however, is not about transferring procedures, but about transferring an attitude.