Black Box Thinking: Why Most People Never Learn from Their Mistakes - But Some Do
12%
Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died . . . We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them.
17%
The reason . . . is not usually laziness or unwillingness. The reason is more often that the necessary knowledge has not been translated into a simple, usable and systematic form. If the only thing people did in aviation was issue dense, pages-long bulletins . . . it would be like subjecting pilots to the same deluge of almost 700,000 medical journal articles per year that clinicians must contend with. The information would be unmanageable. Instead . . . crash investigators [distill] the information into its practical essence.
22%
The more we have riding on our judgments, the more we are likely to manipulate any new evidence that calls them into question.
28%
And this takes us back to perhaps the most paradoxical aspect of cognitive dissonance. It is precisely those thinkers who are most renowned, who are famous for their brilliant minds, who have the most to lose from mistakes. And that is why it is often the most influential people, those who ought to be in the best position to help the world learn from new evidence, who have the greatest incentive to reframe it. And these are also the kinds of people (or institutions) who often have the capacity to employ expensive PR firms to bolster their post hoc justifications. They have the financial means . . .
29%
In his seminal book, Why Smart Executives Fail: And What You Can Learn from Their Mistakes, Sydney Finkelstein, a management professor at Dartmouth College, investigated major failures at more than fifty corporate institutions.11 He found that error-denial increases as you go up the pecking order. Ironically enough, the higher people are in the management hierarchy, the more they tend to supplement their perfectionism with blanket excuses, with CEOs usually being the worst of all. For example, in one organization we studied, the CEO spent the entire forty-five-minute interview explaining all . . .
34%
Now, judges are supposed to be rational and deliberative. They are supposed to make decisions on hard evidence. But Danziger found something quite different: if the case was assessed by a judge just after he had eaten breakfast, the prisoner had a 65 percent chance of getting parole. But as time passed through the morning, and the judges got hungry, the chances of parole gradually diminished to zero. Only after the judges had taken a break to eat did the odds shoot back up to 65 percent, only to decrease back to 0 over the course of the afternoon. The judges were oblivious to this astonishing . . .
49%
Note the similarity of the final quote of Duflo with that of Brailsford earlier in this chapter. “The whole approach comes from the idea that if you break down a big goal into small parts, and then improve on each of them, you will gain a huge increase when you put them all together.”
55%
As the neuroscientist David Eagleman says in his book Incognito: The Secret Lives of the Brain: “When an idea is served up from behind the scenes, the neural circuitry has been working on the problems for hours or days or years, consolidating information and trying out new combinations. But you merely take credit without further wonderment at the vast, hidden political machinery behind the scenes.”
55%
conference table.”9 And this helps to explain why cities are so creative, why atriums are important; in fact why any environment that allows disparate people, and therefore ideas, to bump into each other, is so conducive. They facilitate the association of diverse ideas, and bring people face-to-face with dissent and criticism. All help to ignite creativity.
60%
Think of it like this: if our first reaction is to assume that the person closest to a mistake has been negligent or malign, then blame will flow freely and the anticipation of blame will cause people to cover up their mistakes. But if our first reaction is to regard error as a learning opportunity, then we will be motivated to investigate what really happened.
64%
As the philosopher Karl Popper put it: “True ignorance is not the absence of knowledge, but the refusal to acquire it.”
77%
The irony is that the social world is more complex than the natural world. We have general theories predicting the movement of the planets, but no general theories of human behavior. As we progress from physics, through chemistry and biology, out to economics, politics, and business, coming up with solutions becomes more difficult. But this strengthens rather than weakens the imperative of learning from failure.
77%
Free markets are successful, in large part because of their capacity to clock up thousands of useful failures. Centrally planned economies are ineffective, on the other hand, because they lack this capacity. Markets, like other evolutionary systems, offer an antidote to our ignorance. They are not perfect, and often need government intervention to work properly. But well-functioning markets succeed because of a vital ingredient: adaptability. Different companies trying different things, with some failing and some surviving, add to the pool of knowledge. Cognitive dissonance is thwarted, in the . . .
77%
John Stuart Mill, the British philosopher, wrote about the importance of “experiments in living.” He based his defense of freedom not on an abstract value, but upon the recognition that civil society also needs trial and error. Social conformity, he argued, is catastrophic because it limits experimentation (it is the sociological equivalent of deference to authority). Criticism and dissent, far from being dangerous to the social order, are central to it. They drive new ideas and fire creativity.*
79%
Another “failure based” technique, which has come into vogue in recent years, is the so-called pre-mortem. With this method a team is invited to consider why a plan has gone wrong before it has even been put into action. It is the ultimate “fail fast” technique. The idea is to encourage people to be open about their concerns, rather than hiding them out of fear of sounding negative.
79%
A pre-mortem typically starts with the leader asking everyone in the team to imagine that the project has gone horribly wrong and to write down the reasons why on a piece of paper. He or she then asks everyone to read a single reason from the list, starting with the project manager, before going around the table again.