Kindle Notes & Highlights
by David Robson
Read between April 1 – July 27, 2019
‘These extremely high-skilled and knowledgeable professionals were getting sucked into these crazy things, saying “this is intelligent, this is rational”, then wasting an incredible amount of time.’*
Having worked at the BBC while researching this book, it occurs to me that deciding to create a three-series sitcom about your own organisational failings – rather than fixing them – is perhaps the definition of functional stupidity.
‘strategic ignorance’ is now well studied in psychological experiments where participants must compete for money: often they choose not to know how their decisions affect the other players.
if we mine the same vein day after day, we may begin to pay less attention to the nuances and details. The German language, incidentally, has a word for this: the Fachidiot, a one-track specialist who takes a single-minded, inflexible approach to a multifaceted problem.
entrepreneurs often look to explain their failings with external factors (‘my idea was before its time’) rather than considering the errors in their own performance, and how it might be adapted in the future.
‘Instead of getting better – which this “fail forward” idea would suggest – they actually get worse over time,’ Spicer said. ‘Because of these self-serving biases, they just go and start a new venture and make exactly the same mistakes over and over again . . . and they actually see this as a virtue.’
(It is the reason we are wiser when advising a friend about a relationship problem, even if we struggle to see the solution to our own troubles.)
‘when playing Russian roulette, the fact that the first shot got off safely is little comfort for the next’.13
outcome bias,
we passively accept the most salient detail from an event (what actually happened) and don’t stop to think about what might have been, had the initial circumstances been slightly different.
Unsurprisingly, the complete failure was judged most harshly, but most of the participants were happy to ignore the design flaw in the ‘near-miss’ scenario, and instead praised Chris’s leadership skills.
‘Multiple near-misses preceded and foreshadowed every disaster and business crisis we studied,’ Tinsley’s team concluded in an article for the Harvard Business Review in 2011.15
Subsequent analyses revealed 57 previous instances in which the Concorde tyre had burst on the runway, and in one case the damage was very nearly the same as for Flight 4590 – except, through sheer good luck, the leaking fuel had failed to ignite.
Tinsley has found that people are far more likely to note and report near misses when safety is emphasised as part of the overall culture, in its mission statements – sometimes with as much as a five-fold increase in reporting.21
Those told that ‘NASA, as a highly visible organization, must operate in a high-safety, safety-first environment’, in contrast, successfully identified the latent danger.
‘when you experience a near miss, your risk tolerance will increase and you won’t be aware of it.’
lack of reflection, engagement and critical thinking
‘But it’s rarely what happened on the spot that caused the accident. It’s often what happened years before.’
In addition to studying disasters, Roberts’ team has also examined the common structures and behaviours of ‘high-reliability organisations’ such as nuclear power plants, aircraft carriers, and air traffic control systems that operate with enormous uncertainty and potential for hazard, yet somehow achieve extremely low failure rates.
emphasise the need for reflection, questioning, and the consideration of long-term consequences – including, for example, policies that give employees the ‘licence to think’.
Preoccupation with failure: The organisation is never complacent with success, and workers assume ‘each day will be a bad day’. The organisation rewards employees for self-reporting errors.
Reluctance to simplify interpretations: Employees are rewarded for questioning assumptions and for being sceptical of received wisdom. At Deepwater Horizon, for instance, more engineers and managers may have raised concerns about the poor quality of the cement and asked for further tests.
Sensitivity to operations: Team members continue to communicate and interact, to update their understanding of the situation at …
SUBSAFE specifically instructs officers to experience ‘chronic uneasiness’, summarised in the saying ‘trust, but verify’, and in the more than five decades since, the US Navy hasn’t lost a single submarine using the system.
empowering junior staff to question assumptions and to be more critical of the evidence presented to them, and encouraging senior staff to actively engage the opinions of those beneath them so that everyone is accountable to everyone else.
near miss.
‘If I had more time and resources, would I make the same decisions?’
‘pause and learn’,
near-miss reporting systems;
pre-mortems and post-mortems, and appointing a devil’s advocate whose role is to question decisions and look for flaws in their logic.
‘it leads to slightly dissatisfied people but better-quality decisions.’
engineer, it pays to humbly recognise your limits and the possibility of failure, take account of ambiguity and uncertainty, remain curious and open to new information, recognise the potential to grow from errors, and actively question everything.
Although one-third of INPO’s inspectors are permanent staff, the majority are seconded from other power plants, leading to a greater sharing of knowledge between organisations, and the regular input of an outside perspective in each company.
Since INPO began operating, US generators have seen a tenfold reduction in the number of worker accidents.35
Study after study has shown that encouraging people to define their own problems, explore different perspectives, imagine alternative outcomes to events, and identify erroneous arguments can boost their overall capacity to learn new material while also encouraging a wiser way of reasoning.
The World Economic Forum has listed increasing political polarisation and the spread of misinformation in ‘digital wildfires’4 as two of the greatest threats facing us today – comparable to terrorism and cyber warfare.
This may sound like wishful thinking, but remember that American presidents who scored higher on scales of open-mindedness and perspective taking were far more likely to find peaceful solutions to conflict. It’s not unreasonable to ask whether, given this research, we should be actively demanding those qualities in our leaders, in addition to more obvious measures of academic achievement and professional success.
we often spend huge amounts of time trying to boost our self-esteem and confidence. ‘But I think that if more people had some humility about what they know and don’t know, that would go a tremendous distance to improving life for everyone.’