Kindle Notes & Highlights
by Ozan Varol
Read between October 2 – November 12, 2022
One way to shock your brain and generate wacky ideas is to ask, What would a science-fiction solution look like? Fiction transports us to a reality far different from our own—without the need to ever leave our couch.
Musk certainly does his part to boost that image. Every time he opens his mouth, he gives you a reason to doubt him. Aerospace consultant Jim Cantrell, recalling their initial encounters, thought Musk was out of his mind.
Kennedy’s promise of the Moon in less than a decade? Impossible. Marie Curie’s attempts to break gender barriers in science? Preposterous. Nikola Tesla’s vision of a wireless system for transmitting information? Science fiction.
“Most highly successful people have been really right about the future at least once at a time when people thought they were wrong,” Sam Altman writes. “If not, they would have faced much more competition.”57 Today’s laughingstock is tomorrow’s visionary. You’ll be the one laughing when you cross the finish line.
When X first starts spinning ideas for moonshots, divergent thinking predominates. “At the very early stages of idea formation,” Felten told me, “there’s tremendous value to science-fiction thinking. If it doesn’t break the laws of physics, the idea is potentially fair game.”
X’s goal is to make moonshot thinking the new norm. To this end, the company aims to consistently shock the collective mental muscles of the team.
Backcasting enabled humankind’s first actual moonshot. NASA began with the result of landing humans on the Moon and worked backward to determine the steps necessary to get there: Get a rocket off the ground first, then put a person in orbit around Earth, then do a spacewalk, then rendezvous and dock with a target vehicle in Earth orbit, and then send a manned spacecraft to the Moon to circle around it and come back.
If you want to write a book, you’ll imagine sitting in front of your computer every single day for two years putting one awkward word after the next, writing one ghastly draft chapter after another, polishing, tweaking, and retweaking—even if you don’t feel like it—with no recognition or accolades.
To counter the sunk-cost fallacy, put the monkey first—tackle the hardest part of the moonshot up front. Beginning with the monkey ensures that your moonshot has a good chance of becoming viable before you’ve poured massive amounts of resources into a project.
Here’s the thing: What’s easy often isn’t important, and what’s important often isn’t easy.
In the first part of this book (“Launch”), you learned how to reason from first principles and ignite your thinking by conducting thought experiments and taking moonshots to generate radical solutions to thorny problems.
When we’re familiar with a problem, and when we think we have the right answer, we stop seeing alternatives. This tendency is known as the Einstellung effect. In German, einstellung means “set,” and in this context, the term refers to a fixed mental set or attitude.
In a survey of 106 senior executives spanning ninety-one corporations in seventeen countries, 85 percent agreed or strongly agreed that their businesses were bad at defining problems and that this weakness, in turn, imposed significant costs.
“When you see a good move, don’t make it immediately. Look for a better one.”
In one famous study, Jacob Getzels and Mihaly Csikszentmihalyi found that the most creative art students spend more time in the preparation and discovery stage than do their less creative counterparts.
What they were seeing was an outcrop of bedrock right in front of the rover. Why would something as benign as bedrock leave a scientist speechless? Exposed, layered bedrock is the closest thing there is to time travel. Bedrock is like a history book. It shows us exactly what happened a long, long time ago, on this planet far, far away.
All of Opportunity’s big discoveries came within the first six weeks of the mission, thanks to its opportunistic landing site—which was made possible by our decision to send two rovers.
Opportunity—or Oppy, as we lovingly called it—kept going until June 2018, when a giant dust storm covered the rover’s solar panels, starving it of power. NASA officials sent hundreds of commands asking Oppy to call home, with no success. In February 2019, Opportunity was officially pronounced dead—over fourteen years into its ninety-day expected lifetime, having roamed a record-breaking twenty-eight miles on the red planet.26
But tools, as author Neil Gaiman reminds us, “can be the subtlest of traps.”33 Just because a hammer is sitting in front of you doesn’t mean it’s the right tool for the job. Only when you zoom out and determine the broader strategy can you walk away from a flawed tactic.
The teams that make the most money don’t use the five dollars at all. They realize that the five dollars is a distracting, and essentially worthless, resource.
the most valuable resource was the three-minute presentation time they had in front of a captive Stanford class. They sold their three-minute slot to a company interested in recruiting Stanford students and walked away with $650.
George de Mestral created Velcro after he saw his pants covered in cockleburs following a walk.39 He examined the cockleburs under a microscope and discovered a hooklike shape that he then emulated to create the hook-and-loop fastener called Velcro—with one side stiff like the cockleburs and the other side smooth like his pants.
As Amazon grew from an online bookstore to an “everything” store, it built up an immense electronic infrastructure, including storage and databases. The company realized that its infrastructure wasn’t simply an internal resource. It could also be sold to other companies as a cloud-computing service, to be used for storage, networking, and databases. AWS eventually became a cash cow for Amazon, generating roughly $17 billion in revenue in 2017—more than Amazon’s retail division.42
The answer was a resounding yes. It took only three years after the launch of Sputnik for the United States to implement this thought experiment and launch five satellites into orbit to guide its nuclear submarines. Although it was called the Transit system at the time, its name was changed in the 1980s to something that has become an everyday term: the Global Positioning System, or GPS.
Faraday came along and reversed Ørsted’s experiment. Instead of passing a wire with electric current over a magnet, he passed a magnet around a coil of wire. This generated an electrical current that grew bigger the faster he spun the magnet. Faraday’s reversal experiment gave rise to modern hydroelectric and nuclear power plants, both of which use turbines to generate electricity by spinning coils of wire through a magnetic field.
Darwin adopted the same reversal mantra.49 While other field biologists looked for differences between species, Darwin searched for similarities. He compared, for example, the wing of a bird with the hand of a human. Exploring the similarities between otherwise vastly different species eventually culminated in the theory of evolution.
It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. —Sherlock Holmes
The mind doesn’t follow the facts. Facts, as John Adams put it, are stubborn things, but our minds are even more stubborn. Doubt isn’t always resolved in the face of facts for even the most enlightened among us, however credible and convincing those facts might be.
Our tendency toward skewed judgment partly results from the confirmation bias. We undervalue evidence that contradicts our beliefs and overvalue evidence that confirms them. “It [is] a puzzling thing,” Robert Pirsig writes. “The truth knocks on the door and you say, ‘Go away, I’m looking for the truth,’ and so it goes away.”
Confirming our theories feels good. We get a hit of dopamine every time we’re proven right. In contrast, hearing opposing views is a genuinely unpleasant experience—so much so that people turn down cold, hard cash to remain in their ideological bubble.
When we seclude ourselves from opposing arguments, our opinions solidify, and it becomes increasingly harder to disrupt our established patterns of thinking.
“The problem here was not the error. It was the failure of NASA’s systems engineering, and the checks and balances in our processes to detect the error. That’s why we lost the spacecraft.” There was a gap—which went undetected—between the story the data told and the story the rocket scientists told themselves.
Regardless of your intelligence, Feynman’s adage holds true: “The first principle is that you must not fool yourself—and you are the easiest person to fool.”
Opinions are sticky. Once we form an opinion—our own very clever idea—we tend to fall in love with it, particularly when we declare it in public through an actual or a virtual megaphone. To avoid changing our mind, we’ll twist ourselves into positions that even seasoned yogis can’t hold.
When your beliefs and your identity are one and the same, changing your mind means changing your identity—which is why disagreements often turn into existential death matches.
I then went back to my scientific training and began to reframe my opinions as working hypotheses. I changed my vocabulary to reflect this mental shift. At conferences, instead of saying “I argue …,” I began to say “This paper hypothesizes.…”
“The eye sees only what the mind is prepared to comprehend.”23 If the mind anticipates a single answer—the Mars Polar Lander may be alive—that’s what the eye will see.
If you can hold conflicting thoughts in your head and let them dance with each other, they’ll produce a symphony that will bring out additional notes—in the form of new ideas—far superior to the original ones.
A scientific theory is never proven right. It’s simply not proven wrong. Only when scientists work hard—but fail—to beat the crap out of their own ideas can they begin to develop confidence in those ideas.
“Nothing in the physical world seems to be constant or permanent,” physicist Alan Lightman writes. “Stars burn out. Atoms disintegrate. Species evolve. Motion is relative.”37 The same is true for facts. Most facts have a half-life. What we’re advised with confidence this year is reversed the next.
Consider, for example, the “simulation hypothesis,” first posited by philosopher Nick Bostrom and later popularized by Elon Musk. The hypothesis says we’re little creatures living in a computer simulation controlled by more-intelligent powers.41 This hypothesis isn’t falsifiable. If we’re like the characters in the video game The Sims, we can’t acquire information about our world from outside it. As a result, we can never prove that our world is not just an illusion.
When our focus shifts from proving ourselves right to proving ourselves wrong, we seek different inputs, we combat deeply entrenched biases, and we open ourselves up to competing facts and arguments. “I don’t like that man,” Abraham Lincoln is said to have observed. “I must get to know him better.” The same approach should apply to opposing arguments.
Daniel Kahneman won the Nobel Prize in 2002 for his groundbreaking work on the psychology of judgment and decision making. Taking home the Nobel is an impressive feat, but it’s all the more impressive in Kahneman’s case. He won the prize for economics, and he’s a psychologist. “Most people after they win the Nobel Prize just want to go play golf,” explained Princeton professor Eldar Shafir. “Danny’s busy trying to disprove his own theories that led to the prize. It’s beautiful, really.”
In one study, participants became more critical of their own ideas when those ideas were presented to them as if they were someone else’s.
Everything we observe in the world is through our own eyes. What may be obvious to others—we’re swimming in water—isn’t obvious to us. Others have that seemingly freakish ability to spot the mismatch in our units of measurement or our collective delusion about a signal from a dead Martian lander.
This internet-fueled tribalism exacerbates our confirmation bias. As our echo chambers get louder and louder, we’re repeatedly bombarded with ideas that reiterate our own. When we see our own ideas mirrored in others, our confidence levels skyrocket. Opposing ideas are nowhere to be seen, so we assume they don’t exist or that those who adopt them must be irrational.
As a result, we must consciously step outside our echo chamber. Before making an important decision, ask yourself, “Who will disagree with me?” If you don’t know any people who disagree with you, make a point to find them. Expose yourself to environments where your opinions can be challenged, as uncomfortable and awkward as that might be.
If you can’t find opposing voices, manufacture them. Build a mental model of your favorite adversary, and have imaginary conversations with them.
In constructing a model of how an adversary thinks, you must be as objective and fair as possible. Avoid the instinct to caricature the opposing position, making it easier to debunk—a tactic called the straw man.
Instead of using a straw man tactic, engage in its opposite, the steel man. This approach requires you to find and articulate the strongest, not the weakest, form of the opposition’s argument. Charlie Munger, vice chairman of Berkshire Hathaway, is a major proponent of this idea. “You’re not entitled to take a view,” he cautions, “unless and until you can argue better against that view than the smartest guy who holds that opposite view.”

