Black Box Thinking: Why Most People Never Learn from Their Mistakes - But Some Do
10%
But even if we practice diligently, we will still endure real-world failure from time to time. And it is often in these circumstances, when failure is most threatening to our ego, that we need to learn most of all.
11%
In effect, the holes in the returning aircraft represented areas where a bomber could take damage and still return home safely.
Matthew Ackerman
The missing data is equally important. Inversion, or a similar thinking process that flips the assumption, question, problem, or conclusion on its head, can reveal more accurate or objective insights.
11%
“Learning from failure is anything but straightforward. The attitudes and activities required to effectively detect and analyze failures are in short supply in most companies, and the need for context-specific learning strategies is underappreciated. Organizations need new and better ways to go beyond lessons that are superficial.”
11%
Now we will have a brief look at success, and our responses to that.
12%
This is the paradox of success: it is built upon failure.
12%
Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died . . . We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them.
12%
Take science, a discipline where learning from failure is part of the method.
12%
science progresses through its vigilant response to its own mistakes.
13%
Most closed loops exist because people deny failure or try to spin it.
13%
They are compatible with everything that happens. But that also means they cannot learn from anything.
13%
Science is not just about confirmation, it is also about falsification. Knowledge does not progress merely by gathering confirmatory data, but by looking for contradictory data.
13%
you could observe a million white swans, but this would not prove the proposition: all swans are white.
13%
The observation of a single black swan, on the other hand, would conclusively demonstrate its falsehood.
13%
Each flight represents a kind of test. A crash, in a certain sense, represents a falsification of the hypothesis. That is why accidents have a particular significance in improving system safety, rather as falsification drives science.
13%
Chess grandmasters
13%
These individuals have practiced not for weeks or months, but often for years.
13%
These findings have led to the conclusion that expertise is, at least in part, about practice (the so-called 10,000-hour rule).
13%
further studies seemed to contradict this finding.
13%
there are many professions where practice and experience do not have any effect.
13%
How can experience be so valuable in some professions but almost worthless in others?
13%
How could you progress if you don’t have a clue where the ball has landed?
13%
You wouldn’t have any data to improve your accuracy.
13%
Think about being a chess player. When you make a poor move, you are instantly punished by your opponent.
14%
The intuitions of nurses and chess players are constantly checked and challenged by their errors.
14%
They are forced to adapt, to improve, to restructure their judgments. This is a hallmark of what is called deliberate practice.
14%
Feedback, when delayed, is considerably less effective in improving intuitive judgment.*
14%
Without access to the “error signal,” one could spend years in training or in a profession without improving at all.
Matthew Ackerman
A system should close the feedback loop quickly.
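A toy sketch of this note (hypothetical, not from the book): a learner guessing a hidden number improves only when every attempt returns an immediate error signal; withhold the signal and a hundred rounds of practice change nothing.

import random

def practice(rounds: int, immediate_feedback: bool) -> float:
    """Return how far the learner's intuition ends up from the target."""
    target = 70.0   # the hidden "right answer"
    guess = 0.0     # the learner's current intuition
    for _ in range(rounds):
        attempt = guess + random.uniform(-5, 5)  # noisy execution
        error = target - attempt                 # the error signal
        if immediate_feedback:
            guess += 0.3 * error                 # adjust toward the target
        # without feedback the guess never moves: practice without learning
    return abs(target - guess)

print("error after practice with feedback:   ", round(practice(100, True), 1))
print("error after practice without feedback:", round(practice(100, False), 1))

Running it shows the gap: with feedback the error collapses toward zero; without it, the learner finishes exactly where they started, like the golfer who cannot see where the ball lands.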
14%
But notice one final thing: students don’t study these “failed” scientific theories anymore. Why would they?
14%
But this tendency creates a blind spot.
14%
By looking only at the theories that have survived, we don’t notice the failures that made them possible.
14%
It was while at the Toyota plant that he had a revelation. Toyota has a rather unusual production process. If anybody on the production line is having a problem or observes an error, that person pulls a cord that halts production across the plant.
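A minimal sketch of that mechanism (hypothetical names, not Toyota's actual system): a shared stop signal that any station can set, and that every station checks before doing more work.

import queue
import random
import threading

line_stopped = threading.Event()  # the shared "cord"
work = queue.Queue()
for i in range(20):
    work.put(i)

def station(name: str) -> None:
    while not line_stopped.is_set():  # respect a pulled cord
        try:
            item = work.get_nowait()
        except queue.Empty:
            return                    # no work left
        if random.random() < 0.05:    # a defect is observed
            line_stopped.set()        # pull the cord: halt the whole line
            print(f"{name}: defect on item {item}, stopping the line")
            return
        print(f"{name}: processed item {item}")

threads = [threading.Thread(target=station, args=(f"station-{n}",)) for n in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

The design point is that the halt is global and immediate: the error is surfaced the moment it is observed rather than hidden and passed downstream.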
15%
This success is not a one-off or a fluke; it is a method. Properly instituted learning cultures have transformed the performance of hospitals around the world.
15%
learning from mistakes has two components. The first is a system. Errors can be thought of as the gap between what we hoped would happen and what actually did happen.
15%
each system has a basic structure at its heart: mechanisms that guide learning and self-correction.
15%
the most beautifully constructed system will not work if professionals do not share the information that enables it to flourish.
15%
It was only when the mindset of the organization changed that the system started to deliver amazing results.
16%
The difference between aviation and health care is sometimes couched in the language of incentives.
16%
But this analysis misses the crucial point. Remember that pilots died in large numbers in the early days of aviation. This was not because they lacked the incentive to live, but because the system had so many flaws. Failure is inevitable in a complex world.
16%
incentives to improve performance can only have an impact, in many circumstances, if there is a prior understanding of how improvement actually happens.
16%
Unless we alter the way we conceptualize failure, incentives for success can often be impotent.
16%
One particular problem in health care is not just the capacity to learn from mistakes, but also that even when mistakes are detected, the learning opportunities do not flow throughout the system.
16%
called the “adoption rate.”
17%
The problem is not that the information doesn’t exist; rather, it is the way it is formatted.
17%
The reason is more often that the necessary knowledge has not been translated into a simple, usable and systematic form.
17%
[distill] the information into its practical essence.
17%
When the probability of error is high, the importance of learning from mistakes is more essential, not less.
17%
The key issue, however, is not about transferring procedures, but about transferring an attitude.