I feel like I need a book club for this one - I was left feeling like some of my favorite tools had been debunked, but that I'd gotten very little in their place. I should note that I also use systems thinking for evaluating my own systems - in 'wai' or accident. But this book was such an indictment of any linear thinking.
“Drifting into failure is a gradual, incremental decline into disaster driven by environmental pressure, unruly technology, and social processes that normalize growing risk. No organization is exempt from drifting into failure. The reason is that the routes to failure trace through the structures, processes, and tasks that are necessary to make an organization successful. Failure doesn't come from the occasional abnormal dysfunction or breakdown of the structures and processes and tasks. It is the inevitable byproduct of their normal functioning. The same characteristics that guarantee the fulfillment of the organization's mandate will turn out to be responsible for undermining that very mandate. Drifting into failure is a slow, incremental process. An organization using all of its resources in pursuit of its mandate… gradually borrows more and more from the margins that once buffered it from assumed boundaries of failure. The very pursuit of the mandate over time, and under the pressure of various environmental factors like competition and scarcity, dictates that it does this borrowing, that it does things more efficiently, that it does more with less, perhaps takes greater risks. Thus it is the very pursuit of the mandate that creates the conditions for its eventual collapse. The bright side inexorably brews the dark side, given enough time, enough uncertainty, enough pressure…

This reading of how organizations fail contradicts traditional, and some would say simplistic, ideas about how component failures are necessary to explain accidents. The traditional model would claim that for accidents to happen something must break, something must give, something must malfunction. This may be a component part or a person. But in stories of drift into failure, organizations fail precisely because they're doing well - on a narrow range of performance criteria, that is, the ones they get rewarded on in their current political or economic or commercial configuration. In drift into failure, accidents can happen without anything breaking, without anybody erring, without anybody violating the rules they consider relevant.

I believe that our conceptual apparatus for understanding drift into failure is not yet well developed. In fact, most of our understanding is held hostage by a Newtonian-Cartesian vision of how the world works. This makes particular, and often entirely taken-for-granted, assumptions about decomposability and the relationships between cause and effect. These assumptions may be appropriate for understanding simpler systems, but they are becoming increasingly inadequate for examining how formal, bureaucratically organized risk management in a tightly interconnected, complex world contributes to the incubation of failure. The growth of complexity in society has outpaced our understanding of how complex systems work and fail. Our technologies have gotten ahead of our theories. We are able to build things whose properties we understand in isolation. But in competitively regulated societies their connections proliferate, their interactions and dependencies multiply, their complexities mushroom.”
“In courts we argue that people could reasonably have foreseen harm, and that harm was indeed caused by their action or omission. We couple assessments of the extent of negligence, or the depth of the moral depravity of people's decisions, with the size of the outcome. If the outcome was worse… then the actions that led up to it must be really, really bad.”
“Complexity is the defining characteristic of society and many of its technologies today. And yet simplicity and linearity remain the defining characteristics of the stories and the theories that we use to explain bad events that emerged from this complexity. Our language and logic remain imprisoned in the space of linear interactions and component failures that was once defined by Newton and Descartes.”
“...we may be overconfident that we can foresee the effects because we apply Newtonian folk science to our understanding of how the world works. With this we make risk assessments and calculate failure probabilities. But in complex systems we can never predict results; we can only predict probabilities.”
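This point clicked for me when I sketched it in code. Here's a toy Monte Carlo (entirely my own construction, not from the book): each run of the "system" accumulates many small, interacting pressures, so no single run is predictable, but the failure probability across runs is stable.

```python
import random

# Toy model (my illustration): a system state nudged by many small,
# roughly independent pressures. No component ever "breaks"; the run
# either drifts past an assumed safety margin or it doesn't.
def one_run(steps=1000, margin=10.0):
    state = 0.0
    for _ in range(steps):
        state += random.gauss(0.01, 0.5)   # small bias + noise
        if state > margin:                 # crossed the safety margin
            return True
    return False

runs = 2_000
failures = sum(one_run() for _ in range(runs))
# We cannot say which run fails, only how often runs fail.
print(f"estimated failure probability: {failures / runs:.3f}")
```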
“Direct unmediated or objective knowledge of how a whole complex system works is impossible to get. Knowledge is merely an imperfect tool used by an intelligent agent to help it achieve its personal goals. An intelligent agent not only doesn't need an objective representation of reality in its head or in its computer model, it could never achieve one. The world is too large, too uncertain, too unknowable.”
“Thus the harmful outcome is not reducible to the actual decisions by individuals in the system; it is a routine byproduct of the characteristics of the complex system itself.”
“Drift occurs in small steps.”
“Post-structuralism stresses the relationship between the reader and the text as the engine of truth. Reading, in post-structuralism, is not seen as the passive consumption of what is already there, provided by somebody who already possessed the truth and is just passing it on. Rather, reading is a creative act, a constitutive act in which readers generate meanings out of their own experience and history with the text and with what it points to. As post-structuralism sees it, the author and reader aren't very different at all. Even authors write within a context of other texts… in which choices are made about what to look at and what not to, choices that are governed by the author's own background and institutional arrangements and expectations. The author is no longer a voice of truth in this. Text, or any other language available about events or incidents, has thereby lost its stability. Nietzsche, for example, would have been very distrustful of the suggestion that while everybody has different interpretations it is still the same text. No: he, and post-structuralism in general, does not believe that there is a single world that we are all interpreting differently, one on which we could in principle reach agreement when we put all the different pictures together. More perspectives don't mean a greater representation of some underlying truth. How does this work in a complex system? More perspectives typically mean more contradictions. Of course there might be some partial overlap, but different perspectives on an event will create different stories that are going to be contradictory, guaranteed. The reason, says complexity science, is that complex systems can never be wholly understood or exhaustively described.”
“Systems thinking is about relationships, not parts… Systems thinking is about accidents that are more than the sum of their broken parts. It is about understanding how accidents can happen when no parts are broken, or no parts are seen as broken. Which produces a second question, perhaps even more fascinating: why did none of these deficiencies, which are now so obvious, strike anybody as deficiencies at the time? Or, if somebody did note the deficiencies, then why was that voice apparently not sufficiently persuasive? If things really were as bad as we can make them look post-mortem, then why was everybody, including the regulator tasked with spending public money to protect safety, happy with what was going on?”
“Jens Rasmussen suggested that work in complex systems is bounded by three types of constraints: there's an economic boundary beyond which the system cannot sustain itself financially. Then there's a workload boundary beyond which people and technologies can no longer perform the tasks they are supposed to be doing. And there's a safety boundary beyond which the system will functionally fail.”
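Rasmussen's picture is easy to restate as a tiny model. A minimal sketch (the field names and numbers are mine, not Rasmussen's): the operating point lives inside three constraints, and the pressure that pushes it away from the economic and workload boundaries quietly pushes it toward the safety one.

```python
from dataclasses import dataclass

@dataclass
class OperatingPoint:
    cost: float      # spend relative to budget
    workload: float  # demand relative to capacity
    risk: float      # proximity to functional failure

def violated(p: OperatingPoint, budget=1.0, capacity=1.0, limit=1.0):
    out = []
    if p.cost > budget:
        out.append("economic")
    if p.workload > capacity:
        out.append("workload")
    if p.risk > limit:
        out.append("safety")
    return out

# "Efficiency" moves the point: cost and workload look comfortably
# inside their boundaries while the safety boundary is already crossed.
print(violated(OperatingPoint(cost=0.7, workload=0.9, risk=1.2)))
# -> ['safety']
```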
“Few regulators will ever claim that they have adequate time and personnel resources to fully carry out their mandates. Yet the fact that resource pressure is normal doesn't mean that it has no consequences.”
“As a system is taken into use it learns. And as it learns it adapts. A critical ingredient of this learning is the apparent insensitivity to mounting evidence that, from the perspective of a retrospective outsider, could have shown how bad judgments and decisions actually are. This is how it looks from the position of the retrospective outsider. The retrospective outsider sees a failure of foresight. From the inside, however, the abnormal is pretty normal and making trade-offs in the direction of greater efficiency is nothing unusual. In making these trade-offs, however, there's a feedback imbalance. Information on whether a decision is cost effective or efficient can be relatively easy to get… how much is or was borrowed from safety to achieve that goal, however, is much more difficult to quantify and compare. If it was followed by a safe [outcome], apparently it must have been a safe decision.”
“Empirical success, in other words, is no proof of safety. Past success does not guarantee future safety. Borrowing more and more from safety may go well for a while, but we never know when we're going to hit [the boundary].”
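The feedback imbalance is the part I keep coming back to, so here's a toy simulation of it (the numbers are made up; only the shape matters): each small trade-off is immediately rewarded, the borrowed safety stays invisible, and every uneventful day reads as confirmation.

```python
import random

random.seed(1)
margin = 100.0                 # invisible safety margin (arbitrary units)
for day in range(1, 10_001):
    margin -= 0.02             # one small, locally rational trade-off per day
    # Risk only grows once the margin is nearly spent - feedback arrives late.
    p_accident = max(0.0, (50 - margin) / 50_000)
    if random.random() < p_accident:
        print(f"accident on day {day}; every prior day looked like proof of safety")
        break
else:
    print("no accident observed - which still proves nothing")
```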
“There's something complex and organic about [systems], something ecological that is lost when we model them as a layer of defense with a hole in it, when we see them as a mere deficiency or a latent failure. When we see systems instead as internally plastic, as flexible, as organic, their functioning is controlled by dynamic relationships and ecological adaptation rather than by rigid mechanical structures.”
“Self-transcendence - the ability to reach out beyond currently known boundaries and learn and develop and perhaps improve.”
“The banality of accidents thesis... Incidents do not precede accidents - normal work does.” --> accidents are a normal byproduct of everyday activities and decision-making within organizations, rather than being caused by isolated failures or mistakes.
“Safety culture for example breaks culture down into what employees experience or believe… as well as the presence or absence of particular components of [safety investments]. It measures those, adds them up, and gets an outcome for safety culture. Together these components are assumed to constitute a culture. Yet it is never made clear how the parts become the whole.”
[simplistic thought says] “There is only one true story of what happened. Not just because there is only one pre-existing order to be discovered, but also because the knowledge of that story is a mental representation, or mirror, of that order. The truest story is the one where the gap between external events and internal representation is the smallest. The true story is the one in which there is no gap at all.”
“Zvi Lanir used the term Fundamental Surprise to capture the sudden revelation that one’s perception of the world is entirely incompatible with what turned out to be the case.”
“According to the Sequence of Events idea, events preceding the accident happen linearly in a fixed order, and the accident itself is the last event in the sequence.”
“The Swiss cheese model got its name from the image of multiple layers of defense, or cheese, with holes in them. Only a particular relationship between those holes, however (when they all line up), will allow the hazard to reach the object that was supposed to be protected.”
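The textbook version is just a product of probabilities. A back-of-envelope sketch (the hole probabilities are hypothetical): if the layers were truly independent, the hazard would rarely get through; the drift thesis is precisely that production pressure correlates the holes, so they line up far more often than the product suggests.

```python
# One hypothetical "hole" probability per defensive layer.
hole_probs = [0.05, 0.10, 0.02, 0.08]

p_through = 1.0
for p in hole_probs:
    p_through *= p          # hazard passes only if every layer has a hole

print(f"P(all holes line up, if independent) = {p_through:.2e}")  # 8.00e-06
```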
High reliability theory: leadership safety initiatives (others will never be enticed to find safety more compelling than their leadership does); the need for redundancy (duplication or overlap); decentralization, culture, and continuity; and organizational learning.
“They trust the procedures to keep them apprised of developing problems, and believe that these procedures focus on the most important events and ignore the least significant ones. Success narrows perceptions. It changes attitudes. It reinforces a single way of doing business. It breeds overconfidence in the adequacy of current practices. And it reduces the acceptance of opposing points of view.”
“Thus even though experts may be well educated and motivated, even to them a warning of an incomprehensible and unimaginable event cannot be seen because it cannot be believed. This places severe limits on the rationality that can be brought to bear in any decision-making situation. Seeing what one believes, and not seeing that for which one has no beliefs, are essential to sensemaking, as Weick says.”
“Mechanistic thinking about failures, that is the Newtonian-Cartesian approach, means going down and in. Understanding where things went wrong comes from breaking open the system and diving down, finding the parts, identifying which ones are broken. This approach is taken even if the parts are located in different areas of the system… In contrast, systems thinking about failures means going up and out. Understanding comes from seeing how the system is configured in a larger network of other systems.”
“It is known as recursion, or as fractals when speaking in geometric terms, where stochastically self-similar processes operate at different relative sizes or orders of magnitude. Such patterns, recognizable at different levels of resolution when studying complex systems, are one way of finding some order, some pattern in how complexity works, some way of mapping what goes on there.”
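A random walk is the simplest stochastically self-similar process I know of, so here's a small check (my example, not the book's): rescale a walk by any factor and it is statistically the same shape - the spread between points grows like the square root of the lag.

```python
import random

random.seed(0)
walk = [0.0]
for _ in range(100_000):
    walk.append(walk[-1] + random.choice((-1.0, 1.0)))

def mean_spread(lag):
    # Average distance covered across non-overlapping windows of a given size.
    diffs = [abs(walk[i + lag] - walk[i]) for i in range(0, len(walk) - lag, lag)]
    return sum(diffs) / len(diffs)

for lag in (10, 100, 1000):
    # Each 10x zoom-out widens the spread by roughly sqrt(10) ~ 3.16,
    # the signature of statistical self-similarity.
    print(lag, round(mean_spread(lag), 2))
```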
“And then proclaiming that such an organization or crime syndicate has been dealt with by chopping off its head reduces the problem to what it isn't. Chopping off the head of a complex system doesn't work. It doesn't work at all. It is logically impossible. After all, its executive intelligence, its central nervous system, is distributed throughout the entire system. It is the system that is complex and smart and adaptive, not some omniscient government that can be dealt with in isolation.”
“Complex is not the same as complicated. A complicated system can have a huge number of parts and interactions between parts but it is in principle exhaustively describable.”
“If accidents are emergent properties, then the accident proneness of the organization cannot be reduced to the accident proneness of the people who make up the organization. You can suffer an organizational accident in an organization where people themselves have no little accidents or incidents, in which everything looks normal, and everybody is abiding by their rules.”
“Indeed we can ask whether our analyses and theories of past accidents and disasters tell us anything useful at all for designing institutions with better future performance, or whether we are merely left with the observation that complex organizations faced with turbulent environments will repeatedly fail us in unpredictable ways, and the only practical advice to risk managers is to stay fully alert to this possibility.”
“Complexity theory has no answers as to who is accountable for drift into failure. Just as the Newtonian view has only oversimplified, extremely limited, and probably unjust answers. What complexity theory allows us to do however is to dispense with the notion that there are easy answers supposedly within reach of the one with the one best method or most objective viewpoint. Complexity allows us to invite more voices into the conversation and to celebrate the diversity of their contributions. Truth, if there is such a concept, lies in diversity, not singularity.”