Think Like a Rocket Scientist: Simple Strategies You Can Use to Make Giant Leaps in Work and Life
61%
Thomas Edison recounted the story of a conversation with an associate who lamented that after thousands of experiments, he and Edison had failed to discover anything. “I cheerily assured him that we had learned something,” Edison recalled. “For we had learned for a certainty that the thing couldn’t be done that way, and that we would have to try some other way.”
63%
Breakthroughs are often evolutionary, not revolutionary. Take a look at any scientific discovery, and you’ll find there is no magical it. No single aha moment. Science weaves from failure to failure, with each version better than the one that came before. From a scientific perspective, failure isn’t a roadblock. It’s a portal to progress. We embodied this mindset as children. When we learned how to walk, we didn’t get it right on the first try. No one told us, “You’d better think hard about how you take that very first step because you get one step and that’s it.” We repeatedly fell. With each ...more
65%
But the results were the exact opposite. Better teams were making more mistakes, not fewer. What could explain this counterintuitive outcome? Edmondson decided to dig deeper, sending a research assistant into the wild to observe the teams on the hospital floor. The assistant discovered that better teams weren’t making more mistakes. Instead, they were simply reporting more mistakes. The teams that had a climate of openness—where the staff felt safe to discuss mistakes—performed better because employees were more willing to share failures and actively work to reduce them. Edmondson refers to ...more
65%
Research shows that psychological safety stimulates innovation.61 When people feel free to speak up, ask provocative questions, and air half-formed thoughts, it becomes easier to challenge the status quo. Psychological safety also increases team learning.62 In psychologically safe environments, employees challenge questionable calls by superiors instead of obediently complying with them.
68%
The commission determined that the explosion resulted from a failure of the O-rings. At a commission hearing, Richard Feynman stunned television audiences by dropping an O-ring into ice water. The O-ring visibly lost its ability to seal in temperatures similar to those prevailing at the time of Challenger’s launch. The recurring problems with the O-rings had been described in NASA documents as an “acceptable risk,” the standard way of doing business. As one flight after another was completed despite dangerous levels of O-ring damage, NASA began to develop institutional tunnel vision. “Since ...more
69%
Science, as George Bernard Shaw writes, “becomes dangerous only when it imagines that it has reached its goal.”12 Before the Challenger accident, NASA had successfully launched shuttle missions despite the erosion of the O-rings. Before the Columbia accident, numerous shuttle launches had succeeded despite the shedding of foam. Each success reinforced a belief in the status quo. Each success fostered a damn-the-torpedoes attitude. With each success, what would otherwise be considered unacceptable levels of risk became the new norm.
69%
Success is the wolf in sheep’s clothing. It drives a wedge between appearance and reality. When we succeed, we believe everything went according to plan. We ignore the warning signs and the necessity for change. With each success, we grow more confident and up the ante. But just because you’re on a hot streak doesn’t mean you’ll beat the house. As Bill Gates says, success is “a lousy teacher” because it “seduces smart people into thinking they can’t lose.”13 Research supports this intuition.14 In one representative study, financial analysts who made better-than-average predictions over four ...more
69%
But when we fail to look in the mirror and recognize that we succeeded despite making a mistake and despite taking an unwise risk, we court catastrophe. The bad decisions and the dangers will continue into the future, and the success we once experienced will someday elude us. This is why child prodigies unravel. This is why the housing market, believed to be the bedrock of the American economy, crumbled. This is why Kodak, Blockbuster, and Polaroid flamed out. In each case, the unsinkable sinks, the uncrashable crashes, and the indestructible self-destructs—because we assume their previous ...more
70%
NASA didn’t lose a single crew member in space during the Apollo, Mercury, and Gemini missions, when human spaceflight was viewed as a risky work in progress. The only fatalities during those early years occurred during a launch rehearsal test on the ground, when the Apollo 1 spacecraft caught fire. It was only after human spaceflight was viewed as routine that we lost a NASA crew during flight. “We’ve grown used to the idea of space,” President Reagan said after the Challenger disaster, “and perhaps we forget that we’ve only just begun.”
70%
Netflix started out by disrupting the traditional video rental model by shipping DVDs through the mail. But even as the company began to corner that market, its cofounder and CEO, Reed Hastings, remained vigilant.40 As I discussed in an earlier chapter, we can reframe questions to generate better answers by focusing on strategy instead of tactics. Applying this principle, Netflix realized it wasn’t in the DVD-delivery business. That was a tactic. Rather, it was in the movie-delivery business. That was its strategy. Delivering DVDs through the mail was simply one tactic among many ...more
71%
When we succeed, we stop pushing boundaries. Our comfort sets a ceiling, and our frontiers shrink rather than extend. Corporate executives are rarely punished for sticking with a historically successful strategy, even when it fails. But the risk of punishment is far greater if an executive abandons a successful strategy to pursue one that ends up failing. As a result, instead of risking something new, we maintain the same “proven” formula that led to our success. This tactic works well—until it doesn’t.
71%
Near misses lead people to take unwise risks. Rather than urgency, near misses create complacency. In studies, people who have information about near misses make riskier decisions than those with no information about them.49 Even though the actual risk of failure remains the same after a near miss, our perception of the risk decreases.50 At NASA, management interpreted each near miss not as a potential problem, but as data confirming its belief that neither O-ring damage nor foam shedding was a risk factor that would compromise the mission. The managers had a perfect string of successes. ...more
71%
But there’s a problem with this metaphor. A postmortem implies that there must be a dead project, a dead business, or a dead career before we’re moved to action. The idea of death suggests that only catastrophic failures deserve a thorough investigation. But if we wait until disaster strikes to conduct a postmortem, the string of small failures and near misses—the chronic problems that build up slowly over time—goes unnoticed.
73%
By vividly visualizing a doomsday scenario, we come up with potential problems and determine how to avoid them. According to research, premortems increase participants’ ability to correctly determine the reasons for a future outcome by 30 percent.63 If you’re a business leader, a premortem might focus on a product you’re currently designing. You would assume the product failed and then work backward to determine the potential reasons. Perhaps you didn’t test the product properly, or it wasn’t the right fit for your market. If you’re a job candidate, a premortem might involve an interview. ...more
73%
Tragically, for the Challenger and Columbia, these dissenting voices were ignored.71 The burden shifted to the engineers to prove their safety concerns with hard, quantifiable data. Instead of asking for proof that the spacecraft was safe to launch (Challenger) or safe to land (Columbia), management required the engineers to prove that it wasn’t safe. Roger Tetrault, a member of the Columbia Accident Investigation Board, described management’s attitude toward the engineers this way: “Prove to me that it’s wrong, and if you prove to me that there is something wrong, I’ll go look at it.”72 ...more
74%
The psychologist Gerald Wilde calls this phenomenon risk homeostasis.75 The phrase is fancy, but the idea is simple. Measures intended to decrease risk sometimes backfire. Humans compensate for the reduced risk in one area by increasing risk in another. Consider, for example, a three-year study conducted in Munich.76 One portion of a taxicab fleet was equipped with an antilock brake system (ABS). The remainder of the cabs had traditional, non-ABS brakes. The cars were identical in all other respects. They drove at the same time of day, the same days of the week, and in the same weather ...more
76%
The journey cannot end once the mission is accomplished. That’s when the real work begins. When success brings complacency—when we tell ourselves that now that we’ve discovered the New World, there’s no reason to return—we become a shadow of our former selves. In every annual letter to Amazon shareholders, Jeff Bezos includes the same cryptic line: “It remains Day 1.” After repeating this mantra for a few decades, Bezos was asked what Day 2 would look like. He replied, “Day 2 is stasis. Followed by irrelevance. Followed by excruciating, painful decline. Followed by death. And that is why it is ...more