Kindle Notes & Highlights
Read between August 11 and August 11, 2022
Practitioners are not all exposed to the same kind and level of accident risk. This makes it impossible to compare their accident rates and say that some, because of personal characteristics, are more accident-prone than others.
There is no evidence that a system approach dilutes personal accountability. In fact, second victims show just how much responsibility practitioners take for things that go wrong.
If you hold somebody accountable, that does not have to mean exposing that person to liability or punishment.
• You can hold people accountable by letting them tell their story, literally “giving their account.”
• Storytelling is a powerful mechanism for others to learn vicariously from trouble.
Explore the potential for restorative justice.
To understand failure, you first have to understand your own reactions to failure. Reactions to failure are typically:
• Retrospective. They arise from your ability to look back on a sequence of events.
• Counterfactual. They lay out what people could or should have done to avoid the outcome that you now know about.
• Judgmental. They judge people for not doing what you believe they should have done, or for not paying enough attention to what you now know is important.
• Proximal. They focus on those people closest in time and place to (preventing) the mishap.
If you look, for a moment, at the psychological research underlying all this, there are actually two ways in which your understanding of a past situation gets influenced:
• The hindsight bias. Finding out about an outcome increases the estimate we make about its likelihood. In other words, as a retrospective reviewer who knows the outcome of an event, you exaggerate your own ability to predict and prevent the outcome—while not even being aware of that bias.2
• The outcome bias. Once you know the outcome, it changes your evaluation of decisions that led up to it. If the outcome is…
How can you avoid hindsight? Figure 2.2 shows two different perspectives on a pathway to failure:
• The perspective from the outside and hindsight (typically your perspective). From here you can oversee the entire sequence of events—the triggering conditions, its various twists and turns, the outcome, and the true nature of circumstances surrounding the route to trouble.
• The perspective from the inside of the tunnel. This is the point of view of people in the unfolding situation. To them, the outcome was not known (or they would have done something else). They contributed to…
In order to understand error, you have to examine the larger system in which these people worked. You can divide an operational system into a sharp end and a blunt end:
• At the sharp end (for example the train cab, the cockpit, the surgical operating table), people are in direct contact with the safety-critical process.
• The blunt end is the organization or set of organizations that both supports and constrains activities at the sharp end (for example, the airline or hospital; equipment vendors and regulators).
Out of context I: Micro-matching
One of the most popular ways you can assess performance after the fact is to hold it up against a world you now know to be true. This can be:
• A procedure or collection of rules: People’s behavior was not in accordance with standard operating procedures that were found to be applicable for that situation afterward.
• A set of cues: People missed cues or data that turned out to be critical for understanding the true nature of the situation.
• Standards of good practice that people’s behavior falls short of.
The problem is that these…
Out of context II: Cherry-picking
The second broad way in which you can take data out of context, in which you give them meaning from the outside, is by grouping and labeling behavior fragments that, in hindsight, appear to represent a common condition.
Out of context III: The shopping bag
With the benefit of hindsight, it is easy to sweep together all the evidence that people should have seen. If they had, they would have recognized the situation for what we now know it turned out to be. But that doesn’t mean the evidence presented itself that way to people at the time.
There is no “root” cause
So what is the cause of the accident? This question is just as bizarre as asking what the cause is of not having an accident. There is no single cause—neither for failure, nor for success. In order to push a well-defended system over the edge (or make it work safely), a large number of contributory factors are necessary and only jointly sufficient. How is it that a mishap gives you so many causes to choose from? Part of the story is the sheer complexity of the systems that we have put together, and that we have protected our systems so well against failure. A lot needs…
Systems models focus on the whole, not the parts (like the accident models above do). The interesting properties of systems (the ones that give rise to system accidents) can only be studied and understood when you treat them in their entirety. System models build on two fundamental ideas:9
• Emergence: Safety is an emergent property that arises when system components and processes interact with each other and their environment. Safety can be determined only by seeing how parts or processes work together in a larger system;
• Control imposes constraints on the degrees of freedom…
To explain failure, system models do not need a component to break or a human to err. In fact, they do not have to rely on anything “going wrong,” or anything being out of the ordinary. Even though an accident happened, nothing really might have gone wrong—in the sense that nothing that happened was out of the ordinary. Accidents, in other words, are typically the by-product of the normal functioning of the system, not the result of something breaking down or failing inside of that system. In that sense:
• There is not much difference (if at all) between studying a successful or a…
System models do not rely on linear cause–effect relationships to explain how factors interact or relate to one another. This means that they can stay closer to the complexity and goal conflicts behind system success and failure. It also means that they, as models, are more complex.
The management of goal conflicts under uncertainty gets pushed down into local operating units—control rooms, cockpits, operating theaters and the like. There the conflicts are to be negotiated and resolved in the form of thousands of little and larger daily decisions and trade-offs. These are no longer decisions and trade-offs made by the organization, but by individual operators or crews. What they accept as risky or normal will shift over time:
• as a result of pressures and expectations put on them by the organization;
• and as a result of continued success, even under those…
This gave birth in the 1970s to “man-made disaster theory.” It was really the first to see accidents as the result of a drift into failure, and to focus on the organizational blunt end to explain that drift.11 This theory was a call to understand accidents not as sudden phenomena where energy was not contained, but as phenomena over time, where people and whole organizations subtly changed their idea of what was risky in the first place.
Resilience and Safety I versus Safety II
If the major risk is success—that is, things going right—instead of failures, then why do we spend the majority of our safety resources on investigating what goes wrong? This is exactly the question Erik Hollnagel raises. Managing safety on the basis of incidents is only one way—and in a sense a very limited way. It focuses, after all, on the few occasional times when things go (almost) wrong, rather than on the many times that things go right. It is a reactive, lagging kind of safety management that might turn into firefighting instead of a proactive,…
Is safety making sure those few things don’t go wrong, or that as many things as possible go right?
Safety has increasingly morphed from operational value into bureaucratic accountability. Those concerned with safety are more and more removed—organizationally, culturally, psychologically—from those who do safety-critical work at the sharp end.
Safety as Responsibility Down, Not Accountability Up
Life at the thin edge of the wedge
Many industries have become considerably safer over the past century. For sure, there are activities that lie at or near the unsafe system definition (certain forms of surgery, for example, or recreational activities such as base jumping). But many organizations are now ultra-safe or near-zero. They live at the thin edge of the wedge. Creating even more progress on safety there is quite difficult. It is easy to see, however, that doing more of the same does not lead to something different: it will maintain the status quo. In other words, emphasizing…
Decoy phenomena
What can offer some insight into managing risk at the thin edge of the wedge? One is “decoy phenomena.” Barry Turner identified these in the 1970s in his studies of industrial disasters and accidents (he was, as you might recall, the founder of man-made disaster theory). Decoy phenomena are apparent safety issues that take up a lot of the organization’s attention. They may have been singled out as a safety priority. Your organization may have labeled them as one of the “top five” or “top three” risk priorities that needs to be controlled in order to prevent an accident. Such…
At the thin edge of the wedge, holes in layers of defense and formally reported incidents are no longer the herald of accidents or fatalities. Normal work is.
Hard fixes change something fundamental about, or in, the organization. This makes them hard. But it also makes them real fixes.
Your organization and ‘human error’
• Your organization is not basically or inherently safe. People have to create safety by putting tools and technologies to use while negotiating multiple system goals at all levels of your organization.
• The priorities and preferences that people express through their practice may be a logical reproduction of what the entire organization finds important.
• Human error is the inevitable by-product of the pursuit of success in an imperfect, unstable, resource-constrained world. The occasional human contribution to failure occurs because…
What to think of when investigating ‘human error’
• As far as the people involved were concerned, the outcome was not going to happen. If they knew it was going to, they would have done something else.
• Nobody comes to work to do a bad job. This is the local rationality principle. People do what makes sense to them at the time given their focus of attention, their knowledge and their goals (which may well be the organization’s goals, stated or unstated).
• Human error is not the cause of failure, but the effect. So ‘human error,’ under whatever label (“loss of situation…
Reprimanding “Bad Apples” is like peeing in your pants. You did something about the problem and feel relieved. But then it gets cold and u...
Creating progress on safety with the new view
• To create safety, you don’t need to rid your system of ‘human errors’. Instead, you need to realize how people at all levels in the organization contribute to the creation of safety and risk through goal trade-offs that are legitimate and desirable in their setting.
• Rather than trying to reduce “violations,” New View strategies will find out more about the gap between work-as-imagined and work-as-done—why it exists, what keeps it in place and how it relates to priorities among organizational goals (both stated and unstated).
…
Speaking for the dead speaks as loudly for them as it does for everyone who could have been in their shoes. It speaks for the past as much as for the future. Of course, we might throw up our hands and admit defeat; say that we cannot crawl into the skull of a dead man; concede that time is irreversible and that reconstructing somebody’s mindset is impossible. But that does not mean that there is nothing systematic about what people do in interaction with complex systems, about why they do what they do. It does not mean that everything about their mind and the world in which its understanding…