Black Box Thinking: Growth Mindset and the Secrets of High Performance
4%
a failure to learn from mistakes has been one of the single greatest obstacles to human progress.
4%
It is partly because we are so willing to blame others for their mistakes that we are so keen to conceal our own. We anticipate, with remarkable clarity, how people will react, how they will point the finger, how little time they will take to put themselves in the tough, high-pressure situation in which the error occurred. The net effect is simple: it obliterates openness and spawns cover-ups. It destroys the vital information we need in order to learn.
4%
(a closed loop is where failure doesn’t lead to progress because information on errors and weaknesses is misinterpreted or ignored; an open loop does lead to progress because the feedback is rationally acted upon).
8%
‘Learn from the mistakes of others. You can’t live long enough to make them all yourself.’
9%
The problem was not a lack of diligence or motivation, but a system insensitive to the limitations of human psychology.
9%
That is one of the ways that closed loops perpetuate: when people don’t interrogate errors, they sometimes don’t even know they have made one (even if they suspect they may have).
9%
“human errors” often emerge from poorly designed systems.
9%
It is about creating systems and cultures that enable organisations to learn from errors, rather than being threatened by them.
10%
In effect, practice is about harnessing the benefits of learning from failure while reducing its cost. It is better to fail in practice in preparation for the big stage than on the big stage itself. This is true of organisations, too, which conduct pilot schemes (and in the case of aviation and other safety critical industries test ideas in simulators) in order to learn, before rolling out new ideas or procedures. The more we can fail in practice, the more we can learn, enabling us to succeed when it really matters.
11%
The observable bullet holes suggested that the area around the cockpit and tail didn’t need reinforcing because it was never hit. In fact, the planes that were hit in these places were crashing because this is where they were most vulnerable.
12%
This is the paradox of success: it is built upon failure.
12%
Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died . . . We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them.
13%
With pseudosciences the problem is more structural. They have been designed, wittingly or otherwise, to make failure impossible. That is why, to their adherents, they are so mesmerising. They are compatible with everything that happens. But that also means they cannot learn from anything.
13%
Knowledge does not progress merely by gathering confirmatory data, but by looking for contradictory data.
14%
Feedback, when delayed, is considerably less effective in improving intuitive judgement.
15%
This success is not a one-off or a fluke, it is a method.
15%
learning from mistakes has two components. The first is a system.
15%
Mechanisms designed to learn from mistakes are impotent in many contexts if people won’t admit to them. It was only when the mindset of the organisation changed that the system started to deliver amazing results.
16%
One particular problem in healthcare is not just the capacity to learn from mistakes, but also that even when mistakes are detected, the learning opportunities do not flow throughout the system. This is sometimes called the ‘adoption rate’.
17%
When the probability of error is high, the importance of learning from mistakes is more essential, not less.
20%
Systems that do not engage with failure struggle to learn.
21%
When Prophecy Fails, they simply redefined the failure.
21%
When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.
21%
‘Cognitive dissonance’ is the term Festinger coined to describe the inner tension we feel when, among other things, our beliefs are challenged by evidence.
25%
Lying to oneself destroys the very possibility of learning.
28%
It hints at the suspicion that the intellectual energy of some of the world’s most formidable thinkers is directed, not at creating new, richer, more explanatory theories, but at coming up with ever-more tortuous rationalisations as to why they were right all along.
30%
The pattern is rarely uncovered unless subjects are willing to make mistakes – that is, to test numbers that violate their belief. Instead most people get stuck in a narrow and wrong hypothesis, as often happens in real life, such that their only way out is to make a mistake that turns out not to be a mistake after all. Sometimes, committing errors is not just the fastest way to the correct answer; it’s the only way.
35%
So far in the book, we have seen that learning from mistakes relies on two components: first, you need to have the right kind of system – one that harnesses errors as a means of driving progress; and second, you need a mindset that enables such a system to flourish.
35%
Cognitive dissonance occurs when mistakes are too threatening to admit to, so they are reframed or ignored. This can be thought of as the internal fear of failure: how we struggle to admit mistakes to ourselves.
36%
an adaptive process driven by the detection and response to failure.
39%
the dangers of ‘perfectionism’: of trying to get things right first time.
39%
The desire for perfection rests upon two fallacies. The first resides in the miscalculation that you can create the optimal solution sitting in a bedroom or ivory tower and thinking things through rather than getting out into the real world and testing assumptions, thus finding their flaws. It is the problem of valuing top-down over bottom-up. The second fallacy is the fear of failure.
41%
Take a policy as simple as reducing the dangers of smoking by cutting tar and nicotine in cigarettes. It sounds great in theory, particularly when used in conjunction with a clever marketing campaign. It looks like a ballistic strategy perfectly designed to hit an important public health target. But when this idea was implemented in practice, it failed. Smokers compensated for the lack of nicotine by smoking more cigarettes and taking longer and deeper drags. The net result was an increase in carcinogens and carbon monoxide.
44%
Often, failure is clouded in ambiguity. What looks like success may really be failure and vice versa.
48%
‘It is about marginal gains,’ he said. ‘The approach comes from the idea that if you break down a big goal into small parts, and then improve on each of them, you will deliver a huge increase when you put them all together.’
50%
‘When I first started in F1, we recorded eight channels of data. Now we have 16,000 from every single parameter on the car. And we derive another 50,000 channels from that data,’ said Paddy Lowe, a Cambridge-educated engineer, who is currently the technical leader of Mercedes F1. ‘Each channel provides information on a small aspect of performance. It takes us into the detail, but it also enables us to isolate key metrics that help us to improve.’
50%
‘You improve your data set before you begin to improve your final function; what you are doing is ensuring that you have understood what you didn’t initially understand,’ Vowles says. ‘This is important because you must have the right information at the right time in order to deliver the right optimisation, which can further improve and guide the cycle.’
50%
Success is about creating the most effective optimisation loop.
50%
Creativity not guided by a feedback mechanism is little more than white noise. Success is a complex interplay between creativity and measurement, the two operating together, the two sides of the optimisation loop.
52%
We need to run lots of trials, lots of replications, to tease out how far conclusions can be extended from one trial to other contexts.
55%
epiphanies often happen when we are in one of two types of environment.
55%
The first is when we are switching off: having a shower, going for a walk, sipping a cold beer, daydreaming.
55%
We have to take a step back for the ‘associative state’ to emerge.
55%
when we are being sparked by the dissent of others.
55%
the breakthroughs happened at lab meetings, where groups of researchers would gather around a desk to talk through their work. Why here? Because they were forced to respond to challenges and critiques from their fellow researchers.
57%
Creativity, then, has a dual aspect. Insight often requires taking a step back and seeing the big picture. It is about drawing together disparate ideas. It is the art of connection. But to make a creative insight work requires disciplined focus.
58%
We learn not just by being correct, but also by being wrong. It is when we fail that we learn new things, push the boundaries, and become more creative. Nobody had a new insight by regurgitating information, however sophisticated.
63%
Blame too much and people will clam up. Blame too little and they will become sloppy.
73%
The Ford Motor Company, his third venture, changed the world. ‘Failure is simply the opportunity to begin again, this time more intelligently,’
74%
The problem is when setbacks lead not to learning, but to recrimination and defeatism.