Kindle Notes & Highlights
by Ozan Varol
For example, rocket scientists conduct thought experiments where a failure produces no tangible damage. They build in redundancies so the mission doesn’t fail even if a component fails. They use tests to lower the stakes because failures on the ground prevent far more disastrous ones in space.
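As a back-of-the-envelope sketch of why redundancy pays off (the numbers below are invented, and real component failures are rarely independent):

```python
# Hypothetical illustration of redundancy: if one component fails with
# probability p, and failures are independent, then n redundant copies
# all fail together with probability p**n.
p = 0.01  # assumed failure probability of a single component

for n in (1, 2, 3):
    print(f"{n} redundant copy/copies: mission-loss probability = {p**n:.6f}")
# 1 copy:   0.010000
# 2 copies: 0.000100
# 3 copies: 0.000001
```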
Parents can take a cue from Sara Blakely. She went from selling fax machines door-to-door to becoming the world’s youngest self-made female billionaire. She credits her success partly to a question that her father would ask her every week when she was growing up: “What have you failed at this week?” If Sara didn’t have an answer, her father would be disappointed. To her father, failing to try was far more disappointing than failure itself.
WE OFTEN ASSUME that failure has an endpoint. We fail until we succeed and then stop failing to reap the benefits of our newly minted position in the pecking order. But failure isn’t a bug to get out of our system until success arrives. Failure is the feature. If we don’t develop a habit of failing regularly, we court catastrophe. As we’ll see in the next chapter, where failure ends, complacency begins.
9 NOTHING FAILS LIKE SUCCESS
How Success Produced the Biggest Disasters in Rocket-Science History
Before the Challenger accident, NASA had successfully launched shuttle missions despite the erosion of the O-rings. Before the Columbia accident, numerous shuttle launches had succeeded despite the shedding of foam. Each success reinforced a belief in the status quo. Each success fostered a damn-the-torpedoes attitude. With each success, what would otherwise be considered unacceptable levels of risk became the new norm.
As Bill Gates says, success is “a lousy teacher” because it “seduces smart people into thinking they can’t lose.”
But here’s the thing: You can do some things wrong and still succeed. The technical term here is dumb luck. A spacecraft with a design flaw can safely land on Mars where the conditions don’t trigger the flaw. A poorly shot soccer ball can end up in the goal if it ricochets off another player. A bad trial strategy can produce a win when the facts and the law are on your side. But success has a way of concealing these blunders. When we’re busy lighting cigars and popping champagne corks, we fail to account for the role that luck played in our triumph.
But when we fail to look in the mirror and recognize that we succeeded despite making a mistake and despite taking an unwise risk, we court catastrophe. The bad decisions and the dangers will continue into the future, and the success we once experienced will someday elude us.
We must treat our work—and ourselves—as permanent works in progress.
As time wore on, NASA began to make compromises on safety and reliability. Its quality-assurance staff was cut by more than two-thirds, from roughly 1,700 in 1970 to 505 in 1986, the year that the Challenger was launched. Marshall Space Flight Center in Alabama—which is responsible for rocket propulsion—was the hardest hit, with a reduction from 615 to 88 staff members. The reductions meant “fewer safety inspections… less careful execution of procedures, less thorough investigation of anomalies, and less documentation of what happened.”25
Routine also brought a standardized set of rules and procedures to NASA, with each flight becoming a straightforward application of those standards. Routine meant sticking to the previously scheduled programming and disregarding anomalies. NASA gradually morphed into a hierarchical organization where compliance with rules and procedures became more important than contribution.
The hierarchy also produced a disconnect between the engineers and the managers. The administrators at NASA abandoned the dirty-hands approach of the Apollo era. The managers were no longer intimately involved with the flight technology, and they eventually lost touch. The culture shifted from one focused on research and development to one that operated more like a business with production pressures.26 The engineers were the ones with the dirty hands, and most of them still believed—despite what...
The moment we pretend an activity is routine is the moment we let our guard down and rest on our laurels. The remedy is to drop the word routine from our vocabulary and treat all our projects—particularly the successful ones—as permanent works in progress.
“Human beings,” social psychologist Daniel Gilbert explains, “are works in progress that mistakenly think they’re finished.”
The five-time world track-and-field champion Maurice Greene didn’t make that mistake and saw himself as a permanent work in progress. Even if you’re a world champion, Greene would caution, you must train like you’re number two.31 When you’re ranked second—or at least you pretend you are—you’re less likely to grow complacent. You’ll rehearse that speech until you know it cold, overprepare for that job interview, and work harder than your competitors.
You can foster this never-complacent mindset by assuming you’re trailing slightly behind and that the villain in your story—
The modern world doesn’t call for finished products. It calls for works in progress, where perpetual improvement wins the game.
As I discussed in an earlier chapter, we can reframe questions to generate better answers by focusing on strategy instead of tactics. Applying this principle, Netflix realized it wasn’t in the DVD-delivery business. That was a tactic. Rather, it was in the movie-delivery business. That was its strategy. Delivering DVDs through the mail was simply one tactic among many others—including streaming media—in service of that strategy. “My greatest fear at Netflix,” CEO Reed Hastings said, “has been that we wouldn’t make the leap from success in DVDs to success in streaming.”41 Hastings saw the writing on the…
To prevent complacency, knock yourself off the pedestal once in a while. “You have to disrupt yourself,” Steve Forbes says, “or others will do it for you.”46 If we don’t experience variability in our track record—if we don’t prevent our confidence from inflating after a string of random successes—then a catastrophic failure will do that for us. But catastrophic failures also tend to end your business or your career. “If you’re not humble,” said former world heavyweight champion Mike Tyson, “life will visit humbleness upon you.”
We tend to ignore near misses both in the air traffic control room and in the boardroom. Research shows that near misses masquerade as successes because they don’t affect the ultimate outcome.47 The airplane doesn’t crash, the business doesn’t tank, and the economy remains stable. All’s well that ends well, and no harm, no foul, we tell ourselves and move on with our day.
Near misses lead people to take unwise risks. Rather than urgency, near misses create complacency.
But there’s a problem with this metaphor. A postmortem implies that there must be a dead project, a dead business, or a dead career before we’re moved to action. The idea of death suggests that only catastrophic failures deserve a thorough investigation. But if we wait until disaster strikes to conduct a postmortem, the string of small failures and near misses—the chronic problems that build up slowly over time—go unnoticed.
Leading up to the Columbia and Challenger accidents, there wasn’t one gross misjudgment, one major miscalculation, or one egregious breach of duty. Rather, “a series of seemingly harmless decisions were made that incrementally moved the space agency” to catastrophe, as sociologist Diane Vaughan writes.51 These were small steps, not giant leaps. The story is a common one. Most corporations perish because they ignore the baby steps, the weak signals, the near misses that don’t immediately affect outcomes.
The goal should be to spot these stealth signals before they snowball into something we can’t control. This means that postmortems shouldn’t be reserved for our worst days on the field. They should follow both failure and success.
The next time you’re tempted to start basking in the glory of your success while admiring the scoreboard, pause for a moment. Ask yourself, What went wrong with this success? What role did luck, opportunity, and privilege play? What can I learn from it? If we don’t ask these questions, luck will eventually run its course, and the near misses will catch up with us. This set of questions, as you may have noticed, is no different from the ones we explored in the last chapter on failure. Asking the same questions and following the same process regardless of what happens is one way of…
A postmortem can be useful in uncovering and correcting mistakes. But it also has a drawback: When we conduct postmortems after a success, we already know the outcome. We tend to assume good outcomes resulted from good decisions and bad outcomes resulted from bad decisions. It’s hard to find mistakes when we know we succeeded, and it’s hard to avoid the blame game when we know we failed.
In a premortem, we travel forward in time and set up a thought experiment where we assume the project failed. We then step back and ask, “What went wrong?” By vividly visualizing a doomsday scenario, we come up with potential problems and determine how to avoid them. According to research, premortems increase participants’ ability to correctly determine the reasons for a future outcome by 30 percent.63
A premortem works backward from an undesired outcome. It forces you to think about what could go wrong before you act.
When you conduct a premortem and think through what can go wrong, you should assign probabilities to each potential problem.64 If you quantify uncertainty ahead of time—there’s a 50 percent chance that your new product might fail—you’re more likely to recognize the role that luck played in any resulting success. Quantifying uncertainty can also take the sting out of any failure that follows.
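To make this concrete, here is a minimal sketch of a quantified premortem risk register; the risks, probabilities, and impact scores below are hypothetical illustrations, not from the book:

```python
# Minimal premortem risk register (hypothetical numbers throughout).
# Assume the project has already failed, list the possible reasons,
# and assign each a probability and an impact score before acting.

risks = [
    # (what went wrong, probability it occurs, impact if it does: 1-10)
    ("New product fails to find a market", 0.50, 9),
    ("Key supplier misses the delivery date", 0.30, 6),
    ("Launch budget overruns by more than 20%", 0.25, 5),
]

# Rank by expected impact so the likeliest, costliest failure modes
# get mitigation plans first.
for what, prob, impact in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
    print(f"{what}: p={prob:.0%}, expected impact={prob * impact:.2f}")
```

Writing the probabilities down before launch is what lets you later separate skill from luck: if the 50-percent risk never materialized, the success was partly fortunate, not purely earned.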
The premortems we compile should be easily accessible. At X, these premortems “live on a site where anyone can post something that they’re worried about going wrong in the future,” explains Astro Teller.66
Success only exacerbates this tendency toward conformity. It drives overconfidence in the status quo, which in turn stifles dissent, precisely when dissent is most needed to prevent complacency. “Minority viewpoints are important,” writes Berkeley psychologist Charlan Nemeth, a leading expert on groupthink, “not because they tend to prevail but because they stimulate divergent attention and thought.”70 Even when minority opinions are wrong, “they contribute to the detection of novel solutions and decisions that, on balance, are qualitatively better.” In other words, dissenters force us to look…
The causes of failure in a complex system—whether it’s a rocket or a business—are usually multiple. Technical, human, and environmental factors can combine to produce the failure. Remedying only the first-order causes leaves the second- and third-order causes intact. These deeper causes lurk beneath the surface; they give rise to the first-order causes and can trigger them again.
The deeper causes of the Challenger accident were hidden in NASA’s dark underbelly, as unearthed by Diane Vaughan in her definitive account of the events. She explains that, contrary to the Rogers Commission’s conclusions, the Challenger accident happened precisely because the managers did their jobs. They were following the rules—not violating them. Vaughan uses the term “normalization of deviance” to describe this pathology. The prevailing culture at NASA had normalized flying with unacceptable risks. “The cultural understandings, rules, procedures, and norms that always had worked in the past…
But if we leave the deeper causes unaddressed, the cancer will keep coming back. This is why we heard, in astronaut Sally Ride’s memorable words, the echoes of Challenger in the Columbia accident. As the only person to serve on the investigation boards for both accidents, Ride was uniquely qualified to draw this connection. The technical flaws in the two accidents were different, but the cultural flaws were similar. The deeper causes of the Challenger tragedy had remained unaddressed, even after the technical flaws were fixed and the key decision makers were replaced.
In each case, we confuse a symptom with a cause and leave the deeper causes intact. Painkillers won’t cure our back pain; the source remains. You’re losing market share not because of your competitors but because of your own business policies. Eliminating cartels won’t solve the demand side of the drug problem, and eradicating terrorists won’t prevent new ones from cropping up.
The psychologist Gerald Wilde calls this phenomenon risk homeostasis.75 The phrase is fancy, but the idea is simple. Measures intended to decrease risk sometimes backfire. Humans compensate for the reduced risk in one area by increasing risk in another.
A study of German taxicabs, some equipped with antilock brakes (ABS), found no tangible difference in accident rates between the ABS-equipped cars and the remainder. But one difference was statistically significant: driving behavior. The drivers of the ABS-equipped cars became far more reckless. They tailgated more often. Their turns were sharper. They drove faster. They switched lanes dangerously. They were involved in more near misses. Paradoxically, a measure introduced to boost safety promoted unsafe driving behavior.77
Safety measures also backfired in the Challenger mission. The managers believed that O-rings had a sufficient safety margin “to enable them to tolerate three times the worst erosion observed up to that time.”78 What’s more, there was a fail-safe in place. Even if the primary O-ring failed, the officials assumed the secondary O-ring would seal and pick up the slack.79 The existence of these safety measures boosted a sense of invincibility and led to catastrophe when both the primary and the secondary O-rings failed during launch. These rocket scientists were like German cabbies in ABS-equipped…
This paradox doesn’t mean that we stop fastening our seat belts, buy ancient cars that don’t come with ABS, or take up jaywalking. Instead, pretend the crosswalk isn’t marked, and walk accordingly. Assume the secondary O-ring or the ABS brakes won’t prevent the accident. Keep your head out of the tackle, even if you’re wearing a helmet. Act as if you didn’t receive an extension on that project deadline. The safety net may be there to catch you if you fall, but you’re better off pretending it doesn’t exist.
In every annual letter to Amazon shareholders, Jeff Bezos includes the same cryptic line: “It remains Day 1.” After repeating this mantra for two decades, Bezos was asked what Day 2 would look like. He replied, “Day 2 is stasis. Followed by irrelevance. Followed by excruciating, painful decline. Followed by death. And that is why it is always Day 1.”11 The rocket-science mindset requires remaining in Day 1 and repeatedly introducing color into the monochromatic world. We must keep devising thought experiments, taking moonshots, proving ourselves wrong, dancing with uncertainty, reframing…
“However sweet these laid-up stores, however convenient this dwelling, we cannot remain here,” Walt Whitman wrote. “However shelter’d this port and however calm these waters, we must not anchor here.”12
In the end, there’s no hidden playbook. No secret sauce. The power is there for the taking. Once you learn how to think like a rocket scientist—and nurture that thinking in the long term—you can turn the unimaginable into the imaginable, mold science fic...