We can’t enjoy kissing just anyone, but we can relish being right about almost anything.
Granted, Ross’s mistake was particularly awkward. But it was not particularly consequential—not for Meadows, who was gracious about it; not for Ross; not even for Ross’s career. So wasn’t wanting to die something of an extreme reaction? Maybe. But if so, it is an extreme reaction to which we all sometimes succumb.
Whatever damage can arise from erring pales in comparison to the damage that arises from our fear, dislike, and denial of erring. This fear acts as a kind of omnipurpose coagulant, hardening heart and mind, chilling our relationships with other people, and cooling our curiosity about the world.
As error goes from being a hallmark of the lawless mind to our native condition, people cease to be fundamentally perfectible and become fundamentally imperfect.
This was the pivotal insight of the Scientific Revolution: that the advancement of knowledge depends on current theories collapsing in the face of new insights and discoveries. In this model of progress, errors do not lead us away from the truth. Instead, they edge us incrementally toward it.
Early on in this book, I observed that one of the recurring questions about error is whether it is basically eradicable or basically inevitable. As a philosophical matter, this question is important, since (as I suggested earlier) the way we answer it says a lot about how we feel about being wrong. As a practical matter, it’s clear that the answer lies somewhere in between: many kinds of error can and should be curtailed, very few can be done away with entirely, and some we shouldn’t even want to get rid of.
A process operating at Six Sigma experiences just 3.4 errors per million opportunities to err, a laudably low failure rate (or, framed positively, a 99.9997 percent success rate). To get a sense of what this means, consider a company that ships 300,000 packages per year: at a 99 percent success rate, it sends 3,000 packages to the wrong place. If that same company achieved Six Sigma, only a single package would go astray.
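The arithmetic is easy to verify; here is a minimal sketch using the shipment volume and rates quoted above:

```python
# Back-of-the-envelope check of the Six Sigma arithmetic quoted above.

PACKAGES_PER_YEAR = 300_000

# A 99 percent success rate means 1 in 100 packages goes astray.
failures_at_99_percent = PACKAGES_PER_YEAR * 0.01            # 3,000 packages

# Six Sigma allows 3.4 defects per million opportunities (DPMO).
failures_at_six_sigma = PACKAGES_PER_YEAR * 3.4 / 1_000_000  # ~1 package

print(f"99% success rate:     {failures_at_99_percent:,.0f} misdelivered packages")
print(f"Six Sigma (3.4 DPMO): {failures_at_six_sigma:.1f} misdelivered packages")
```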
Traditionally, many companies evaluate their success based on how well they do on average—whether it takes an average of three days to deliver that package, say, or whether the brake pads you manufacture are an average of three-eighths of an inch thick. But the trouble with averages is that they can conceal many potential lapses and mistakes.
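To see how an average can hide trouble, consider the brake-pad case. The sketch below uses made-up thickness numbers and an assumed tolerance of ±0.01 inch around the three-eighths-inch target, purely for illustration: two batches with identical averages, one safe and one not.

```python
# Two hypothetical batches of brake pads, thickness in inches.
# Both average exactly 3/8 inch (0.375), but only one stays near spec.
batch_a = [0.374, 0.376, 0.375, 0.375, 0.375]  # tight around the target
batch_b = [0.300, 0.450, 0.375, 0.320, 0.430]  # same average, wild swings

def mean(xs):
    return sum(xs) / len(xs)

# Assumed tolerance band of +/-0.01 inch, chosen only for this example.
def out_of_spec(xs, target=0.375, tol=0.01):
    return sum(1 for x in xs if abs(x - target) > tol)

for name, batch in [("A", batch_a), ("B", batch_b)]:
    print(f"Batch {name}: mean={mean(batch):.3f}, out of spec={out_of_spec(batch)}")
# Batch A: mean=0.375, out of spec=0
# Batch B: mean=0.375, out of spec=4
```

Judged by the average alone, the two batches are indistinguishable; judged by deviation from spec, one of them is a recall waiting to happen.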
Relying on hard data, committing to open and democratic communication, acknowledging fallibility: these are the central tenets of any system that aims to protect us from error. They are also markedly different from how we normally think—from our often hasty and asymmetric treatment of evidence, from the cloistering effects of insular communities, and from our instinctive recourse to defensiveness and denial. In fact, the whole reason these error-proofing techniques exist is to exert a counterweight on our steady state.