Kindle Notes & Highlights
by Byrne Hobart
Read between November 21 and November 26, 2024
No one could have foreseen such a trend. In the 1920s, the US was a relative academic backwater, particularly for physics. J. Robert Oppenheimer had a touch of thymotic megalomania 158—perhaps deservedly so—but even he had to admit that he didn’t know about quantum mechanics or electron spin until he studied at Cambridge.
For Americans studying physics, getting a PhD often required learning German. As late as 1933, there was only one American attendee at the prestigious Solvay Conference on physics and chemistry in Brussels: Ern...
This highlight has been truncated due to consecutive passage length restrictions.
And, for a while, it made a tiny town in New Mexico one of the most cosmopolitan places on Earth, where one could overhear conversations in English, German, Hungarian, Polish, Russian, Italian, and Dutch (and, courtesy of Oppenheimer, the occasional monologue in Sanskrit).
In October, Sachs finally got his meeting, summarized the letter, and made his case. FDR’s response: “This requires action.” Soon after, the Manhattan Project was launched.
The Allies even sent Moe Berg, a professional baseball player turned spy, to Switzerland to listen to a lecture by German physicist Werner Heisenberg to determine whether it sounded like Germany would be able to develop the bomb before the end of the war. If the answer was yes, he was to assassinate Heisenberg.
The physicists working in Los Alamos did not all agree on the new goal, but an endeavor with the Manhattan Project’s scope tends to synthesize motivations and coordinate behavior. This is another striking feature of bubbles: When they expand, differences among participants are felt less acutely. Under the urgency of war, Soviet-sympathizing scientists could keep company with emerging Cold War warriors. Some project participants, including Oppenheimer, had dabbled in left-wing causes; one, Klaus Fuchs, would even pass atomic secrets to the Soviets.
During the Manhattan Project, it’s probable that Los Alamos contained the largest concentration of brainpower ever assembled. So high was the bar for participation that the choice of Oppenheimer, a non-Nobelist, to serve as the scientific leader of the Los Alamos site was controversial, since his role would involve directing several Nobel winners, including Bohr, Fermi, Lawrence, James Chadwick, and Isidor Rabi.
These initial worries proved unfounded, as Oppenheimer played a crucial role in catalyzing this concentration of genius. An ascetic polymath as comfortable discussing sonnets as cyclotrons, Oppenheimer had been put in charge at Los Alamos because he possessed both expertise and charisma. In one case, the physicist Robert Serber, after hearing a single lecture from Oppenheimer, immediately left his role at Princeton to join him at Berkeley. When Oppenheimer relocated from Berkeley to Los Alamos, Serber followed.
Feynman was not the only risk-taker at Los Alamos. A notable factor on this score is the youth of the project participants—the average age of experimenters and engineers at Los Alamos was just 25. 164 Youth, as most of us recall from our teenage years, breeds a risk-taking mentality.
The parallel approach to uranium enrichment increased the risk that the amount of money, time, and material wasted on fruitless efforts would be significant. But it lowered the more important risk that the bomb would not be completed at all. Bubbles have a paradoxical relationship with risk. The more people who dive in with both feet, the more likely it is that their collective efforts will eventually produce something that makes all the work and risk worthwhile.
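To make that risk trade-off concrete with a purely illustrative calculation (the probabilities below are invented for illustration and do not come from the book): if each of three independent enrichment approaches had, say, a 40 percent chance of working on its own, then

$$P(\text{at least one succeeds}) = 1 - (1 - 0.4)^3 = 1 - 0.216 \approx 78\%.$$

Running the approaches in parallel sharply improves the odds that something works, even though, in expectation, most of the money is "wasted" on efforts that never pan out.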
Oppenheimer was in the extremely rare position of being personally acquainted with von Neumann, familiar with using computers to solve practical problems, and able to fund new projects. After the war, Oppenheimer took charge of the Institute for Advanced Study (IAS) and used the institute's money to pay for von Neumann's work. Von Neumann agreed not to patent his computer, which is fortunate. His design, known as the von Neumann architecture, revolutionized computing and still describes the basic structure of modern machines.
Nuclear competition also prompted the development of game theory, an important branch of math and economics likewise invented by von Neumann. In a world of mutually assured destruction in which a state could retaliate against a devastating nuclear attack but could not stop an assault already underway, academics developed more rigorous decision-making models that incorporated known incentives and uncertain information. Today, game theory underpins the ad pricing models of major internet companies like Alphabet, Amazon, and Facebook. And, to really bring the example full circle, Alphabet invests some
...more
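As a loose illustration of how game theory shows up in ad pricing, here is a minimal sketch of a sealed-bid second-price (Vickrey) auction, the mechanism family that early search-ad auctions drew on. The function and the bids are hypothetical; this is not any company's actual pricing system.

# Minimal sketch of a sealed-bid second-price (Vickrey) auction.
# The highest bidder wins but pays the second-highest bid, which makes
# bidding one's true value the dominant strategy.

def second_price_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Return (winner, price) for a single-item second-price auction."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price


# Hypothetical advertisers bidding (in dollars) for one ad slot.
bids = {"alpha": 2.50, "bravo": 1.75, "charlie": 3.10}
print(second_price_auction(bids))  # ('charlie', 2.5)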
Once the US, the USSR, and other countries had access to powerful bombs, they sought to demonstrate that these bombs could be delivered swiftly and accurately. Apollo was a plausible way to do this. It allowed the US to show that it could deliver a warhead-sized payload anywhere in the solar system—including Moscow.
US GDP from 1939 through 1945 averaged $163 billion. Scaled to the same share of today's GDP, the Manhattan Project would cost about $250 billion.
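As a rough sanity check on that scaling (assuming the commonly cited nominal cost of roughly \$2 billion and current US GDP on the order of \$21 trillion, neither figure appearing in the highlight itself):

$$\frac{\$2\ \text{billion}}{\$163\ \text{billion}} \approx 1.2\%, \qquad 1.2\% \times \$21\ \text{trillion} \approx \$250\ \text{billion}.$$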
While the Nazis targeted academia in part because many prominent Jewish people worked in the field and had made contributions to physics, it is worth noting that academia is, in general, sensitive to oppressive ideologies because universities are directly dependent on the tolerance and largesse of the state.
As we’ve already seen with the Manhattan Project, arbitrary deadlines are a powerful way to turn an approximate but extreme ambition into a specific goal.
The success of Apollo speaks to what Peter Thiel has called “definite optimism,” or the belief that the future will be better than the present for specific, concrete reasons—as contrasted with “indefinite optimism,” the view that GDP will grow and standards of living will improve without the help of any specific change.
From an indefinite optimist perspective, it’s hard to envision the steps necessary to achieve actual improvements. A definite vision of the future gives us a project to undertake in the here and now, which provides a roadmap for how we can create a better future, rather than simply waiting for its arrival and hoping for the best.
The Soviets took a different approach. Rather than frantically show off their bombs, they developed a better way to deliver them. In 1957 they revealed the intercontinental ballistic missile (ICBM), a rocket that could be launched from Soviet territory and, traveling at 15,000 miles per hour, arrive at a target halfway around the globe in 15 minutes. That prompted the Americans to begin testing their first ICBM, Atlas. But the first Atlas test blew up, as did the next five.
In retrospect, the ambition and scale of Apollo seems almost delusional. How could anyone believe that we’d land on the Moon and immediately start colonizing the solar system? It was akin to believing that tulips, real estate, and cryptocurrencies could only go up in value.
But Apollo was also a bubble in another sense. Its growth followed bubble-like dynamics, leading researchers to describe it as a social bubble—a bubble that unfolded outside financial markets, inflated not by speculators but by technocrats and politicians.
The greatest risk of manned space missions is, as one former NASA astronaut put it, “not death” but “not to explore at all.” 190 Worse still is the idea that one might miss out on being the first to reach the next frontier—FOMO on a cosmic level.
NASA developed, implemented, and perfected an approach known as systems management. Originating in the air defense and ballistic missile programs of the 1950s, the strategy focuses on the system itself—its boundaries, interactions between subunits, and optimal performance—rather than on sequential processes or the work of subunits. Leaders with systemic vision avoid micromanaging and are better able to accelerate a team’s overall progress.
In von Braun’s system of “Monday notes,” engineers and technicians were required to identify the most salient issues and submit a single-page note. After leaving comments in the margins, von Braun would circulate the entire annotated collection of notes within the organization. Through this informal system, everyone was able to tap into the organization’s collective knowledge and contribute solutions to each other’s problems. There was one level of centralization, with von Braun serving as the hub for information, but his role was really to highlight problems in a way that facilitated
...more
In contrast to NASA’s later mantra—“In God we trust. All others must bring data”—Kranz tried to systematically incorporate the gut feelings and hunches of his colleagues into the decision-making process. When two or more technicians or engineers, no matter their position in the hierarchy, shared a concern, he stopped everything to collect and analyze data that would either isolate the problem or rule it out. Sometimes, these qualitative inputs were enough to override a rigorous protocol.
By contrast, when the Challenger exploded in 1986, NASA was heavily reliant on data communication and visualization protocols, hierarchy, and a culture of conformity, all of which were implicated in the accident.
Yet there is a downside to the LOR story that speaks to NASA’s post-Apollo descent into bureaucratic irrelevance. The adoption of LOR came at the expense of an alternative possibility with far greater potential: Earth-orbit rendezvous. This approach was von Braun’s preference and underpinned his master plan for interplanetary travel. LOR was more efficient for the lunar mission specifically, but the Earth-orbit equivalent placed no constraints on where a rocket could travel. In other words, in service of a successful Moon landing and a few uninspiring follow-up trips, we abandoned the best path
...more
Indeed, NASA needed so many integrated circuits that, by the end of 1963, it was purchasing 60 percent of all integrated circuits manufactured in the US. 198 It would be fair to say the Apollo program created the semiconductor industry by driving down the price of chips through sheer purchase volume.
Moore’s law has radically reshaped the trajectory of technological progress over the past half-century. As we will show in the next chapter, it became a self-fulfilling prophecy that resulted in the exponential acceleration of computational power, which brought us the iPhone and Instagram—which are, ironically, distracting large segments of the teenage population from aerospace engineering.
In the mid-1960s, the only demand for integrated circuits that Moore identified in his paper was “Apollo, for manned Moon flight.” 200 By 1969, however, the year Neil Armstrong and Buzz Aldrin walked on the lunar surface, the market for computer chips was 80 times larger than it had been in 1962.
Asked about the purpose of a manned Mars mission, von Braun once remarked that the question of “what we are going to do… once we get there” was a “weak point.” 202 But such questions are missing one of the most important lessons from Apollo. Manned space flight needs to be evaluated less in terms of economic efficiency and more in terms of spiritual effectiveness.
When Facebook was growing, one of its infrastructure goals was to always be able to support 10 times the site’s then-current usage. There’s nothing special about the number 10, apart from the evolutionary accident that we have 10 fingers, but this heuristic turned a vague goal (“Always be ready for growth”) into a highly specific one (“If we’re using 11 percent of our capacity, we need to buy more servers now”). And for Apollo, it meant that instead of getting to the Moon “soon” or “as quickly as possible,” there was a date after which it would be too late.
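A minimal sketch of that headroom rule, with the function name and numbers chosen purely for illustration (this is not Facebook's actual capacity tooling):

# The "always support 10x current usage" heuristic as a concrete rule:
# once usage exceeds 1/10 of capacity, it is time to add servers.

def needs_more_capacity(current_usage: float, total_capacity: float,
                        headroom_factor: float = 10.0) -> bool:
    """Return True once usage crosses 1/headroom_factor of capacity."""
    return current_usage > total_capacity / headroom_factor


# At 11 percent utilization the rule fires; at 9 percent it does not.
print(needs_more_capacity(current_usage=11, total_capacity=100))  # True
print(needs_more_capacity(current_usage=9, total_capacity=100))   # False

The point is the same as the Apollo deadline: a crisp threshold converts "be ready for growth" into an action that triggers at a specific number or date.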
A few days after the Moon landing, Wernher von Braun prepared a plan for manned Mars missions, which included lunar-orbital space stations, Mars surface bases, and nuclear-powered Earth–Moon shuttles. The plan, which had the public support of the vice president and which von Braun presented to the US Senate, stated that ships for Mars would launch on November 12, 1981. While setting an ultra-ambitious deadline worked for Apollo, von Braun was skeptical that the US Senate would support it this time. A few decades later, Elon Musk became known for implementing a similar culture of
...more
Over that period, chip fabrication has become intensely specialized. A single fabrication plant takes years to build and costs upward of $15 billion, more than an aircraft carrier or a nuclear power plant.
This colossal expenditure buys a facility full of machines that manipulate features a few dozen atoms wide, machines whose performance is increasingly limited not by mechanical accuracy but by obscure quantum effects and the demands of extreme-precision manufacturing.
Moore’s law is perhaps the most compelling and enduring example of a two-sided bubble, wherein the expectation of progress in one domain spurs progress in another, which then propels growth in the first. Moore’s law was made possible by the technical ability to cram more transistors onto each chip. It became a reality because software makers responded by building products that assumed further drops in hardware costs.
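For reference, Moore's law is usually stated as transistor counts per chip doubling roughly every two years (Moore's original 1965 paper projected annual doubling, which he later revised). As a rough sketch, with $t$ in years and doubling period $T \approx 2$:

$$N(t) \approx N_0 \cdot 2^{\,t/T}.$$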
Cars, for example, use an increasing number of sensors for safety and fuel efficiency, to the point that electronics already account for 40 percent of a new vehicle’s cost (up from 18 percent in 2000). 211
But chip designers really only needed to believe that they could finish the next generation or two on their roadmap. 214 Then, having done this a few times, they’d develop a sense that this process would continue. Look four or five generations out, and keeping Moore’s law going seems like an impossible task; look one generation out, and it’s a worthy challenge, not an impossible one. (This dynamic may be familiar to anyone who has flipped to the middle of a new math textbook, seen an impenetrable blizzard of obscure symbols and notations, and then, once they actually reach that section, found
...more
The rapid pace of advancement in physics is hard to comprehend today. Walter Brattain, one of the co-discoverers of the transistor, recalls that when he was in graduate school at the University of Minnesota, “quantum mechanics was changing so fast that every student audited [physicist and mathematician John Hasbrouck] Van Vleck’s course every year.” 215 Similarly, when Brattain’s manager, William Shockley, attended Caltech, Shockley’s professors were learning new theories with each round of textbooks, which they would immediately teach to their students. 216 A diligent student who read a
...more
Another Bell Labs researcher, Henry Theurer, realized that the unusual smell of acetylene lights was caused by small impurities of phosphorus. It was a very lucky break. At the time, spectrometers were not sensitive enough to detect impurities that small; the only scientific instrument capable of identifying the accidental dopant was the human nose.
Shockley Semiconductor looked like an ideal startup. It was well funded, its founder was widely respected in the physics community, and its product had growing demand. Shockley had his pick of the smartest graduates in physics and chemistry. His pitch became even more convincing when, soon after starting the company, he was awarded the Nobel Prize in Physics along with Brattain and Bardeen. But while Shockley excelled as a theorist, he had serious deficiencies as a manager.
While the traitorous eight were busy setting up shop in the Bay Area, a small firm in Dallas purchased a license to build transistors from Bell Labs and began experimenting with them. Called Texas Instruments, the company had been formed in 1930 to build equipment for oil exploration. It had survived the tumultuous economy of the Depression and grown into a small-scale but respectable electronics firm. 221 In 1954, Texas Instruments became the first company to successfully commercialize silicon transistors.
Fairchild’s optimism was ruthlessly contagious—buy in to the increasingly inevitable transistorized future, or be left behind.
However, because Sony faced a greater amount of existential risk—it had the attention of Japan’s powerful industrial planners, who would be anxious to see a return on the investment in a license—the company worked harder to sell the radios, which became a breakout export product.
Although Texas Instruments didn’t directly profit from the radio, the product was a gateway to a much larger payoff. IBM’s Thomas Watson showed transistor radios to senior executives, telling them that a company that could build transistors for radios could also do so for computers. By 1957, Texas Instruments had signed a contract with IBM to supply transistors for its computers.
But a spacecraft that needed to contain people, not to mention keep them alive, faced a key constraint: weight. NASA was willing to buy the smallest computers available, even if they were quite expensive, because doing so was cheaper than the cost of sending a few extra pounds into orbit.
This is a common feature of new technologies. The initial killer app may have little to do with the ultimate use case, but it provides enough demand to scale production (and convince the creators they’re on to something).
The first use case for the steam engine, for example, was pumping water out of flooded mines, not transportation. The early model for the automobile and personal computer industries was hobbyists, who weren’t looking for a practical device but something that was fun to use. Airlines in the 1920s made their m...
This highlight has been truncated due to consecutive passage length restrictions.
As a result, in 1968, Moore, Noyce, and other Fairchild Semiconductor employees decided to strike out on their own—again. Instead of working under a cash-rich but strategically unfocused parent company, they’d build their own venture. Initially, Moore and Noyce wanted to name the company after themselves and flipped a coin to decide whose name would go first. However, realizing that “Moore-Noyce” sounded like “more noise”—a poor omen for a business that would sell components for precision instruments—they opted for the vaguer “Intel.”
Fairchild and Intel pioneered some of the management techniques that would later become standard in the chip industry. For example, when there were competing options for performing a task, such as for a potential chip design or a production method, the companies would often assign independent teams to work on multiple models. Whichever one worked best would be implemented at scale. This created some waste and duplication, but it also meant that launches weren’t delayed; parallelizing the process ensured that something would be ready by the target launch date.