Think Like a Rocket Scientist: Simple Strategies You Can Use to Make Giant Leaps in Work and Life
Kindle Notes & Highlights
10%
Life is taxing enough without uncertainty, so we eliminate the uncertainty by ignoring the anomaly. We convince ourselves the anomaly must be an extreme outlier or a measurement error, so we pretend it doesn’t exist.
12%
In addition to including redundancies, rocket scientists address uncertainty by building in margins of safety. For example, they build spacecraft stronger than what appears necessary or make thermal insulation thicker than required.
14%
The status quo is a super magnet. People are biased against the way things could be and find comfort in the way things are.
15%
Roughly 70 percent of the time, the participants changed their correct answers and went along with the wrong answers given by the rest of the group.
20%
Every time you introduce complexity to a system, you’re giving it one more aspect that can fail.
23%
At best, we pay lip service to curiosity but end up discouraging it in practice.
32%
We have to generate ideas first before we can begin evaluating and eliminating them. If we cut the accumulation process short—if we immediately start thinking about consequences—we run the risk of hampering originality.
36%
Backcasting reorients you toward the path. If you want to climb a mountain, you’ll imagine training with your backpack on, hiking at high altitudes to get used to the low-oxygen environment, climbing stairs to build up muscles, and running to improve endurance.
36%
With this reorientation, you also condition yourself to derive intrinsic value from the process rather than chasing elusive outcomes.
37%
If we restrict ourselves to what’s possible given what we have, we’ll never reach escape velocity and create a future worth getting excited about. In the end, all moonshots are impossible. Until you decide to go.
45%
We quickly jump from “This sounds right to me” to “This is true.” Confirming our theories feels good. We get a hit of dopamine every time we’re proven right.
47%
From a scientific perspective, opinions present several problems. Opinions are sticky. Once we form an opinion—our own very clever idea—we tend to fall in love with it, particularly when we declare it in public through an actual or a virtual megaphone.
48%
To make sure you don’t fall in love with a single hypothesis, generate several.
48%
Ideally, the hypotheses you spin should conflict with each other.
50%
When our focus shifts from proving ourselves right to proving ourselves wrong, we seek different inputs, we combat deeply entrenched biases, and we open ourselves up to competing facts and arguments.
52%
Once you’ve stress-tested your ideas by trying to prove yourself wrong, it’s now time to collide those ideas with reality in tests and experiments.
52%
We don’t rise to the level of our expectations. We fall to the level of our training.
52%
We conduct tests—not to prove ourselves wrong, but to confirm what we believe is true. We tweak the testing conditions or interpret ambiguous outcomes to confirm our preconceptions.
52%
In a proper test, the goal isn’t to discover everything that can go right. Rather, the goal is to discover everything that can go wrong and to find the breaking point.
57%
Great comedians also think like rocket scientists and test their material before a live audience to observe the reaction.
61%
The goal isn’t to fail fast. It’s to learn fast. We should be celebrating the lessons from failure—not failure itself.
63%
The goal, then, is to focus on the variables you can control—the inputs—instead of the outputs.
65%
In one study, cardiac surgeons who observed their colleagues’ blunders got significantly better at performing the procedure.
66%
The worst-performing teams—those that were in most need of improvement—were also the least likely to report errors. And if errors aren’t reported, the team can’t improve.
71%
When we succeed, we stop pushing boundaries. Our comfort sets a ceiling, with our frontiers shrinking rather than extending. Corporate executives are rarely punished for deviating from a historically successful strategy.
71%
As a result, instead of risking something new, we maintain the same “proven” formula that led to our success. This tactic works well—until it doesn’t.
71%
Most corporations perish because they ignore the baby steps, the weak signals, the near misses that don’t immediately affect outcomes.
72%
What went wrong with this success? What role did luck, opportunity, and privilege play? What can I learn from it? If we don’t ask these questions, luck will eventually run its course, and the near misses will catch up with us.
73%
In a premortem, we travel forward in time and set up a thought experiment where we assume the project failed. We then step back and ask, “What went wrong?” By vividly visualizing a doomsday scenario, we come up with potential problems and determine how to avoid them.
73%
According to research, premortems increase participants' ability to correctly identify the reasons for a future outcome by 30 percent.
73%
If we know there was uncertainty attached to a previous decision, it becomes easier to challenge it.
73%
It’s no wonder that groupthink pops up even in organizations whose lifeblood is creativity. Faced with potential backlash, we censor ourselves rather than go against the grain.