Kindle Notes & Highlights
by Matthew Syed
Read between January 9 – April 13, 2021
A progressive attitude to failure turns out to be a cornerstone of success for any institution.
we are often so worried about failure that we create vague goals, so that nobody can point the finger when we don’t achieve them.
a closed loop is where failure doesn’t lead to progress because information on errors and weaknesses is misinterpreted or ignored; an open loop does lead to progress because the feedback is rationally acted upon.
at the level of systemic complexity, success can only happen when we admit our mistakes, learn from them, and create a climate where it is, in a certain sense, ‘safe’ to fail.
‘Learn from the mistakes of others. You can’t live long enough to make them all yourself.’
when people don’t interrogate errors, they sometimes don’t even know they have made one.
The more we can fail in practice, the more we can learn, enabling us to succeed when it really matters.
Practice is not a substitute for learning from real-world failure, it is complementary to it.
We have purchased, at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and having to relearn them.
It is by testing our ideas, subjecting them to failure, that we set the stage for growth.
Feedback, when delayed, is considerably less effective in improving intuitive judgement.
Without access to the ‘error signal’, one could spend years in training or in a profession without improving at all.
Science is not just about a method, then; it is also about a mindset. At its best, it is driven forward by a restless spirit, an intellectual courage, a willingness to face up to failures and to be honest about key data, even when it undermines cherished beliefs.
incentives to improve performance can only have an impact, in many circumstances, if there is a prior understanding of how improvement actually happens.
Type One error: an error of commission.
Type Two error: an error of omission.
this trade-off should not obscure the fact that it is possible to reduce both kinds of error at the same time. That is what progress is ultimately about.
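Note: the trade-off Syed describes maps onto the standard statistical picture of false positives and false negatives. A minimal sketch (not from the book; the screening scenario, noise levels and thresholds are illustrative) of how shifting a decision threshold trades one error type for the other, while a genuinely better test reduces both at once:

```python
# Illustrative sketch (not from the book): a noisy screening test.
import random

random.seed(1)

def simulate(noise, threshold, n=100_000):
    """Return (type_one_rate, type_two_rate) for a noisy score test.

    Healthy cases score around 0.0, sick cases around 1.0; the test
    flags anyone whose noisy score exceeds `threshold`.
    """
    type_one = type_two = 0  # false alarms (commission), misses (omission)
    for _ in range(n):
        sick = random.random() < 0.5
        score = (1.0 if sick else 0.0) + random.gauss(0, noise)
        flagged = score > threshold
        if flagged and not sick:
            type_one += 1   # error of commission
        elif not flagged and sick:
            type_two += 1   # error of omission
    return type_one / n, type_two / n

# Same test, different thresholds: the two error rates trade off.
print(simulate(noise=0.5, threshold=0.3))  # more false alarms, fewer misses
print(simulate(noise=0.5, threshold=0.7))  # fewer false alarms, more misses

# A more accurate test (less noise) reduces BOTH error types at once,
# which is what progress means here.
print(simulate(noise=0.2, threshold=0.5))
```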
When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs.
How can one learn from failure if one has convinced oneself – through the endlessly subtle means of self-justification, narrative manipulation, and the wider psychological arsenal of dissonance-reduction – that a failure didn’t actually occur?
the most effective cover-ups are perpetrated not by those who are covering their backs, but by those who don’t even realise that they have anything to hide.
If it is intolerable to change your mind, if no conceivable evidence will permit you to admit your mistake, if the threat to ego is so severe that the reframing process has taken on a life of its own, you are effectively in a closed loop.
‘We reach the point of diminishing marginal predictive returns for knowledge disconcertingly quickly.’
The pattern is rarely uncovered unless subjects are willing to make mistakes – that is, to test numbers that violate their belief. Instead, most people get stuck in a narrow and wrong hypothesis.
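Note: this is the classic ‘2-4-6’ task. A minimal sketch (not from the book; the rule shown is the standard version of the experiment) of why purely confirmatory testing never exposes a wrong hypothesis:

```python
# Illustrative sketch of the 2-4-6 task (standard version of the experiment).
def hidden_rule(a, b, c):
    """The experimenter's actual rule: any strictly ascending triple."""
    return a < b < c

# Subjects see (2, 4, 6) and typically hypothesise "ascending by two".
confirming_tests = [(4, 6, 8), (10, 12, 14), (1, 3, 5)]
violating_tests = [(1, 2, 3), (5, 4, 3), (2, 4, 100)]

# Confirmatory tests all pass, so the narrow hypothesis survives:
print([hidden_rule(*t) for t in confirming_tests])  # [True, True, True]

# Only triples the narrow hypothesis says should FAIL can expose the
# broader rule: (1, 2, 3) and (2, 4, 100) pass anyway.
print([hidden_rule(*t) for t in violating_tests])   # [True, False, True]
```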
By retrieving, editing and integrating disparate memories, we have imagined an entirely new event.
Cognitive dissonance occurs when mistakes are too threatening to admit to, so they are reframed or ignored. This can be thought of as the internal fear of failure: how we struggle to admit mistakes to ourselves.
the external fear of failure – the fear of being unfairly blamed or punished – which also undermines learning from mistakes.
Evolution as a process is powerful because of its cumulative nature.
Cumulative selection works, then, if there is some form of ‘memory’:
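Note: the kind of ‘memory’ meant here can be sketched in a few lines, in the spirit of Dawkins’s well-known ‘weasel’ program (the target phrase, mutation rate and population size below are illustrative, not from the book). Each generation keeps its best attempt instead of starting from scratch, which is what makes selection cumulative:

```python
# Illustrative sketch in the spirit of Dawkins's 'weasel' program.
import random

random.seed(0)

TARGET = "LEARN FROM FAILURE"  # illustrative target phrase
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def mutate(parent, rate=0.05):
    """Copy the parent, flipping each character with a small probability."""
    return "".join(
        random.choice(ALPHABET) if random.random() < rate else c
        for c in parent
    )

def fitness(attempt):
    """Number of characters that already match the target."""
    return sum(a == t for a, t in zip(attempt, TARGET))

# Start from random noise. Each generation 'remembers' the best string
# found so far and mutates it; that memory makes selection cumulative.
best = "".join(random.choice(ALPHABET) for _ in TARGET)
generations = 0
while best != TARGET:
    generations += 1
    offspring = [mutate(best) for _ in range(100)]
    best = max(offspring + [best], key=fitness)

print(f"Reached {TARGET!r} after {generations} generations")
```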
‘the illusion of design’: animals that look as if they were designed by a vast intelligence when they were, in fact, created by a blind process.
The failure of companies in a free market, then, is not a defect of the system, or an unfortunate by-product of competition; rather, it is an indispensable aspect of any evolutionary process.
It is about having the courage of one’s convictions, but also the humility to test early, and to adapt rapidly.
Tinkering, tweaking, learning from practical mistakes: all have speed on their side. Theoretical leaps, while prodigious, are far less frequent.
we often neglect the messy, iterative, bottom-up aspect of this change because it is easy to regard the world, so to speak, in a top-down way.
narrative fallacy. We are so eager to impose patterns upon what we see, so hardwired to provide explanations, that we are capable of ‘explaining’ opposite outcomes with the same cause without noticing the inconsistency.
Success is not just dependent on before-the-event reasoning, it is also about after-the-trigger adaptation.
how can you drive evolution without a clear selection mechanism?
‘It is possible to make significant progress against the biggest problem in the world through the accumulation of a set of small steps, each well thought out, carefully tested, and judiciously implemented.’
Success is a complex interplay between creativity and measurement, the two operating together, the two sides of the optimisation loop.
Marginal gains is a strategy of local optimisation: it takes you to the summit of the first hill. But once you are there, taking little steps, however well tested, runs out of traction.
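Note: the ‘summit of the first hill’ is the local-optimum problem familiar from optimisation. A minimal sketch (the landscape, starting point and step size are illustrative) of how small, well-tested steps stall at a nearby peak while a higher one goes unfound:

```python
# Illustrative sketch: marginal gains as one-dimensional hill climbing.
def hill_climb(f, x, step=0.1):
    """Take small steps while they improve f; stop at the nearest summit."""
    while True:
        best = max((x, x - step, x + step), key=f)
        if best == x:  # no small step helps any more: a local summit
            return x
        x = best

# A landscape with two hills: a small one near x=1, a taller one near x=4.
def landscape(x):
    return max(0.0, 1 - (x - 1) ** 2) + 3 * max(0.0, 1 - (x - 4) ** 2)

x = hill_climb(landscape, x=0.5)
print(x, landscape(x))  # stops near x=1 (height ~1); never finds x=4 (height 3)
```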
Without a problem, without a failure, without a flaw, without a frustration, innovation has nothing to latch on to.
Creativity, then, has a dual aspect. Insight often requires taking a step back and seeing the big picture. It is about drawing together disparate ideas. It is the art of connection. But to make a creative insight work requires disciplined focus.
insight is about the big picture, development is about the small picture.
blame is, in many respects, a subversion of the narrative fallacy: an oversimplification driven by biases in the human brain.
if our first reaction is to assume that the person closest to a mistake has been negligent or malign, then blame will flow freely and the anticipation of blame will cause people to cover up their mistakes. But if our first reaction is to regard error as a learning opportunity, then we will be motivated to investigate what really happened.
we have to engage with the complexity of the world if we are to learn from it; we have to resist the hardwired tendency to blame instantly,
When we are dealing with complexity, blaming without proper analysis is one of the most common as well as one of the most perilous things an organisation can do.
blame undermines the information vital for meaningful adaptation.
‘It takes real discipline to probe the black box data without prejudging the issue.’ In a sense, blame is a subversion of the narrative fallacy. It is a way of collapsing a complex event into a simple and intuitive explanation: ‘It was his fault!’
When a culture is unfair and opaque, it creates multiple perverse incentives. When a culture is fair and transparent, on the other hand, it bolsters the adaptive process.
It is commercially profitable for papers to run stories that apportion instant blame because there is a ready market for them. After all, we prefer easy stories; we all have an inbuilt bias towards simplicity over complexity. These stories are, in effect, mass-printed by-products of the narrative fallacy. In a more progressive culture, this market would be undermined. Such stories would be met with incredulity. Newspapers would have an incentive to provide deeper analysis before apportioning blame.