Chip War: The Fight for the World's Most Critical Technology
Kindle Notes & Highlights
The United States still has a stranglehold on the silicon chips that gave Silicon Valley its name, though its position has weakened dangerously. China now spends more money each year importing chips than it spends on oil.
World War II was decided by steel and aluminum, and followed shortly thereafter by the Cold War, which was defined by atomic weapons. The rivalry between the United States and China may well be determined by computing power. Strategists in Beijing and Washington now realize that all advanced tech—from machine learning to missile systems, from automated vehicles to armed drones—requires cutting-edge chips, known more formally as semiconductors or integrated circuits. A tiny number of companies control their production. We rarely think about chips, yet they’ve created the modern world. The fate …
Every button on your iPhone, every email, photograph, and YouTube video—all of these are coded, ultimately, in vast strings of 1s and 0s. But these numbers don’t actually exist. They’re expressions of electrical currents, which are either on (1) or off (0). A chip is a grid of millions or billions of transistors, tiny electrical switches that flip on and off to process these digits, to remember them, and to convert real world sensations like images, sound, and radio waves into millions and millions of 1s and 0s.
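A minimal Python sketch (an illustration added here, not a passage from the book) of the idea that ordinary data reduces to patterns of 1s and 0s, the same on/off states that a chip’s transistors hold:

    # Illustrative only: any text ultimately becomes a string of 1s and 0s.
    # Each character maps to a number, and each number to a pattern of bits --
    # the on/off states that transistors physically represent.
    def to_bits(text: str) -> str:
        """Return the 8-bit binary pattern for each byte of `text`."""
        return " ".join(format(byte, "08b") for byte in text.encode("utf-8"))

    print(to_bits("Hi"))  # prints: 01001000 01101001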
Fabricating and miniaturizing semiconductors has been the greatest engineering challenge of our time. Today, no firm fabricates chips with more precision than the Taiwan Semiconductor Manufacturing Company, better known as TSMC.
Fairchild cofounder Gordon Moore noticed in 1965 that the number of components that could be fit on each chip was doubling annually as engineers learned to fabricate ever smaller transistors. This prediction—that the computing power of chips would grow exponentially—came to be called “Moore’s Law” and led Moore to predict the invention of devices that in 1965 seemed impossibly futuristic, like an “electronic wristwatch,” “home computers,” and even “personal portable communications equipment.”
In 1970, the second company Moore founded, Intel, unveiled a memory chip that could remember 1,024 pieces of information (“bits”). It cost around $20, roughly two cents per bit. Today, $20 can buy a thumb drive that can remember well over a billion bits.
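The arithmetic behind those prices, as a rough Python sketch; the 64 GB thumb-drive capacity is an assumption for illustration, not a figure from the book:

    # Rough cost-per-bit comparison implied by the passage.
    price = 20.0                        # dollars, in 1970 and today

    bits_1970 = 1024                    # Intel's 1970 memory chip held 1,024 bits
    cost_per_bit_1970 = price / bits_1970        # ~$0.0195, i.e. about two cents

    bits_today = 64 * 8 * 10**9         # assumed 64 GB thumb drive ~= 5.1e11 bits
    cost_per_bit_today = price / bits_today      # ~3.9e-11 dollars per bit

    print(f"1970:  ${cost_per_bit_1970:.4f} per bit")
    print(f"today: ${cost_per_bit_today:.1e} per bit")
    print(f"decline: roughly {cost_per_bit_1970 / cost_per_bit_today:,.0f}x")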
When we think of Silicon Valley today, our minds conjure social networks and software companies rather than the material after which the valley was named. Yet the internet, the cloud, social media, and the entire digital world only exist because engineers have learned to control the most minute movement of electrons as they race across slabs of silicon. “Big tech” wouldn’t exist if the cost of processing and remembering 1s and 0s hadn’t fallen by a billionfold in the past half century.
California’s culture mattered just as much as any economic structure, however. The people who left America’s East Coast, Europe, and Asia to build the chip industry often cited a sense of boundless opportunity in their decision to move to Silicon Valley. For the world’s smartest engineers and most creative entrepreneurs, there was simply no more exciting place to be.
Other countries have found it impossible to keep up on their own but have succeeded when they’ve deeply integrated themselves into Silicon Valley’s supply chains.
This strategy has yielded certain capabilities that no other countries can replicate—but they’ve achieved what they have in partnership with Silicon Valley, continuing to rely fundamentally on U.S. tools, software, and customers. Meanwhile, America’s most successful chip firms have built supply chains that stretch across the world, driving down costs and producing the expertise that has made Moore’s Law possible.
Taiwan’s TSMC builds almost all the world’s most advanced processor chips. When COVID slammed into the world in 2020, it disrupted the chip industry, too. Some factories were temporarily shuttered. Purchases of chips for autos slumped. Demand for PC and data center chips spiked higher, as much of the world prepared to work from home. Then, over 2021, a series of accidents—a fire in a Japanese semiconductor facility; ice storms in Texas, a center of U.S. chipmaking; and a new round of COVID lockdowns in Malaysia, where many chips are assembled and tested—intensified these disruptions. Suddenly, …
In the age of AI, it’s often said that data is the new oil. Yet the real limitation we face isn’t the availability of data but of processing power. There’s a finite number of semiconductors that can store and process data. Producing them is mind-bogglingly complex and horrendously expensive. Unlike oil, which can be bought from many countries, our production of computing power depends fundamentally on a series of choke points: tools, chemicals, and software that often are produced by a handful of companies—and sometimes only by one. No other facet of the economy is so dependent on so few …
The global network of companies that annually produces a trillion chips at nanometer scale is a triumph of efficiency. It’s also a staggering vulnerability. The disruptions of the pandemic provide just a glimpse of what a single well-placed earthquake could do to the global economy.
Yet the seismic shift that most imperils semiconductor supply today isn’t the crash of tectonic plates but the clash of great powers. As China and the United States struggle for supremacy, both Washington and Beijing are fixated on controlling the future of computing—and, to a frightening degree, that future is dependent on a small island that Beijing considers a renegade province and America has committed to defend by force.
the concentration of advanced chip manufacturing in Taiwan, South Korea, and elsewhere in East Asia isn’t an accident. A series of deliberate decisions by government officials and corporate executives created the far-flung supply chains we rely on today.
To understand how our world came to be defined by quintillions of transistors and a tiny number of irreplaceable companies, we must begin by looking back to the origins of the silicon age.
More accuracy required more calculations. Engineers eventually began replacing mechanical gears in early computers with electrical charges. Early electric computers used the vacuum tube, a lightbulb-like metal filament enclosed in glass. The electric current running through the tube could be switched on and off, performing a function not unlike an abacus bead moving back and forth across a wooden rod. A tube turned on was coded as a 1 while a vacuum tube turned off was a 0. These two digits could produce any number using a system of binary counting—and therefore could theoretically execute …
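A small sketch (added for illustration, assuming a bank of eight switches) of how a row of on/off tubes encodes an ordinary number in binary:

    # Illustrative: a row of vacuum tubes (or, later, transistors), each either
    # on (1) or off (0), can encode any whole number in binary.
    def as_switches(n: int, width: int = 8) -> str:
        """Show which of `width` on/off switches represent the number n."""
        return format(n, f"0{width}b")

    for n in [0, 1, 2, 5, 42]:
        print(n, "->", as_switches(n))
    # 42 -> 00101010: bits 1, 3, and 5 (from the right, starting at 0) are on,
    # and 2 + 8 + 32 = 42.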
This was a leap forward in computing—or it would have been, if not for the moths. Because vacuum tubes glowed like lightbulbs, they attracted insects, requiring regular “debugging” by their engineers. Also like lightbulbs, vacuum tubes often burned out.
So long as computers were moth-ridden monstrosities, they’d only be useful for niche applications like code breaking, unless scientists could find a smaller, faster, cheaper switch.
An only child, he was utterly convinced of his superiority over anyone around him—and he let everyone know it.
Semiconductors, Shockley’s area of specialization, are a unique class of materials. Most materials either let electric current flow freely (like copper wires) or block current (like glass). Semiconductors are different. On their own, semiconductor materials like silicon and germanium are like glass, conducting hardly any electricity at all. But when certain materials are added and an electric field is applied, current can begin to flow. Adding phosphorus or antimony to semiconducting materials like silicon or germanium, for example, lets a negative current flow. “Doping” semiconductor …
Shockley soon built such a device, expecting that applying and removing an electric field on top of the piece of silicon could make it function like a valve, opening and closing the flow of electrons across the silicon. When he ran this experiment, however, he was unable to detect a result. “Nothing measurable,” he explained. “Quite mysterious.” In fact, the simple instruments of the 1940s were too imprecise to measure the tiny current that was flowing. Two years later, two of Shockley’s colleagues at Bell Labs devised a similar experiment on a different type of device. Where Shockley was …
Because transistors could amplify currents, it was soon realized, they would be useful in devices such as hearing aids and radios, replacing less reliable vacuum tubes, which were also used for signal amplification. Bell Labs soon began arranging patent applications for this new device. Shockley was furious that his colleagues had discovered an experiment to prove his theories, and he was committed to outdoing them.
By January 1948, he’d conceptualized a new type of transistor, made up of three chunks of semiconductor material.
Shockley began to perceive other uses, along the lines of the “solid state valve” he’d previously theorized. He could turn the larger current on and off by manipulating the small current applied to the middle of this transistor sandwich. On, off. On, off. Shockley had designed a switch.
When Bell Labs held a press conference in June 1948 to announce that its scientists had invented the transistor, it wasn’t easy to understand why these wired blocks of germanium merited a special announcement. The New York Times buried the story on page 46. Time magazine did better, reporting the invention under the headline “Little Brain Cell.” Yet even Shockley, who never underestimated his own importance, couldn’t have imagined that soon thousands, millions, and billions of these transistors would be employed at microscopic scale to replace human brains in the task of computing.
With time to tinker, he wondered how to reduce the number of wires that were needed to string different transistors together. Rather than use a separate piece of silicon or germanium to build each transistor, he thought of assembling multiple components on the same piece of semiconductor material. When his colleagues returned from summer vacation, they realized that Kilby’s idea was revolutionary. Multiple transistors could be built into a single slab of silicon or germanium. Kilby called his invention an “integrated circuit,” but it became known colloquially as a “chip,” because each …
Shockley had a knack for spotting talent, but he was an awful manager. He thrived on controversy and created a toxic atmosphere that alienated the bright young engineers he’d assembled. So these eight engineers left Shockley Semiconductor and decided to found their own company, Fairchild Semiconductor, with seed funding from an East Coast millionaire. The eight defectors from Shockley’s lab are widely credited with founding Silicon Valley. One of the eight, Eugene Kleiner, would go on to found Kleiner Perkins, one of the world’s most powerful venture capital firms. Gordon Moore, who went on to …
Because the planar process covered the transistor with an insulating layer of silicon dioxide, Noyce could put “wires” directly on the chip by depositing lines of metal on top of it, conducting electricity between the chip’s transistors. Like Kilby, Noyce had produced an integrated circuit: multiple electric components on a single piece of semiconductor material. However, Noyce’s version had no freestanding wires at all. The transistors were built into a single block of material. Soon, the “integrated circuits” that Kilby and Noyce had developed would become known as “semiconductors” or, more …
Noyce and Moore began to realize that miniaturization and electric efficiency were a powerful combination: smaller transistors and reduced power consumption would create new use cases for their integrated circuits. At the outset, however, Noyce’s integrated circuit cost fifty times as much to make as a simpler device with separate components wired together. Everyone agreed Noyce’s invention was clever, even brilliant. All it needed was a market.
Across America, the Soviet space program caused a crisis of confidence. Control of the cosmos would have serious military ramifications. The U.S. thought it was the world’s science superpower, but now it seemed to have fallen behind. Washington launched a crash program to catch up with the Soviets’ rocket and missile programs, and President John F. Kennedy declared the U.S. would send a man to the moon. Bob Noyce suddenly had a market for his integrated circuits: rockets.
The computer that eventually took Apollo 11 to the moon weighed seventy pounds and took up about one cubic foot of space, a thousand times less than the University of Pennsylvania’s ENIAC computer that had calculated artillery trajectories during World War II.
Chip sales to the Apollo program transformed Fairchild from a small startup into a firm with one thousand employees. Sales ballooned from $500,000 in 1958 to $21 million two years later. As Noyce ramped up production for NASA, he slashed prices for other customers. An integrated circuit that sold for $120 in December 1961 was discounted to $15 by the following October. NASA’s trust in integrated circuits to guide astronauts to the moon was an important stamp of approval. Fairchild’s Micrologic chips were no longer an untested technology; they were used in the most unforgiving and rugged environment: …
Winning the Minuteman II contract transformed TI’s chip business. TI’s integrated circuit sales had previously been measured in the dozens, but the firm was soon selling them by the thousands amid fear of an American “missile gap” with the Soviet Union. Within a year, TI’s shipments to the Air Force accounted for 60 percent of all dollars spent buying chips to date. By the end of 1964, Texas Instruments had supplied one hundred thousand integrated circuits to the Minuteman program. By 1965, 20 percent of all integrated circuits sold that year went to the Minuteman program. Pat Haggerty’s bet …
Lathrop called the process photolithography—printing with light. He produced transistors much smaller than had previously been possible, measuring only a tenth of an inch in diameter, with features as small as 0.0005 inches in height. Photolithography made it possible to imagine mass-producing tiny transistors. Lathrop applied for a patent on the technique in 1957.
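For a sense of scale, the passage’s dimensions converted to metric units (a quick sketch added here, not from the book):

    # Converting Lathrop's 1957 transistor dimensions from inches to metric.
    INCH_TO_MM = 25.4

    diameter_mm = 0.1 * INCH_TO_MM             # "a tenth of an inch" ~= 2.54 mm
    feature_um = 0.0005 * INCH_TO_MM * 1000    # "0.0005 inches" ~= 12.7 micrometers

    print(f"transistor diameter: ~{diameter_mm:.2f} mm")
    print(f"smallest feature:    ~{feature_um:.1f} micrometers")
    # Leading-edge chips today pattern features measured in nanometers,
    # thousands of times smaller than these 1957 dimensions.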
Haggerty and Kilby realized that light rays and photoresists could solve the mass-production problem, mechanizing and miniaturizing chipmaking in a way that soldering wires together by hand could not.
A master bridge player, Chang approached manufacturing as methodically as he played his favorite card game. Upon arriving at TI, he began systematically tweaking the temperature and pressure at which different chemicals were combined, to determine which combinations worked best, applying his intuition to the data in a way that amazed and intimidated his colleagues. “You had to be careful when you worked with him,” remembered one colleague. “He sat there and puffed on his pipe and looked at you through the smoke.” The Texans who worked for him thought he was “like a Buddha.” Behind the tobacco …
The Nobel Prize for inventing the transistor went to Shockley, Bardeen, and Brattain. Jack Kilby later won a Nobel for creating the first integrated circuit; had Bob Noyce not died at the age of sixty-two, he’d have shared the prize with Kilby. These inventions were crucial, but science alone wasn’t enough to build the chip industry. The spread of semiconductors was enabled as much by clever manufacturing techniques as by academic physics. Universities like MIT and Stanford played a crucial role in developing knowledge about semiconductors, but the chip industry only took off because graduates of …
Now that he was running Fairchild, a company seeded by a trust-fund heir, he had flexibility to treat the military as a customer rather than a boss. He chose to target much of Fairchild’s R&D not at the military, but at mass market products. Most of the chips used in rockets or satellites must have civilian uses, too, he reasoned. The first integrated circuit produced for commercial markets, used in a Zenith hearing aid, had initially been designed for a NASA satellite. The challenge would be making chips that civilians could afford. The military paid top dollar, but consumers were price …
In 1965, Moore was asked by Electronics magazine to write a short article on the future of integrated circuits. He predicted that every year for at least the next decade, Fairchild would double the number of components that could fit on a silicon chip. If so, by 1975, integrated circuits would have sixty-five thousand tiny transistors carved into them, creating not only more computing power but also lower prices per transistor. As costs fell, the number of users would grow. This forecast of exponential growth in computing power soon came to be known as Moore’s Law. It was the greatest …
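The arithmetic behind that 1975 figure, sketched in Python; the starting point of roughly 64 components per chip in 1965 is an assumption for illustration, not a number quoted in the passage:

    # Moore's 1965 extrapolation: components per chip doubling every year.
    components_1965 = 64           # assumed starting point, circa 1965
    doublings = 1975 - 1965        # ten annual doublings

    components_1975 = components_1965 * 2 ** doublings
    print(components_1975)         # 65536 -- the "sixty-five thousand" in the passage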
Defense contractors thought about chips mostly as a product that could replace older electronics in all the military’s systems. At Fairchild, Noyce and Moore were already dreaming of personal computers and mobile phones. When U.S. defense secretary Robert McNamara reformed military procurement to cut costs in the early 1960s, causing what some in the electronics industry called the “McNamara Depression,” Fairchild’s vision of chips for civilians seemed prescient.
In 1966, Burroughs, a computer firm, ordered 20 million chips from Fairchild—more than twenty times what the Apollo program consumed. By 1968, the computer industry was buying as many chips as the military. Fairchild chips served 80 percent of this computer market. Bob Noyce’s price cuts had paid off, opening a new market for civilian computers that would drive chip sales for decades to come. Moore later argued that Noyce’s price cuts were as big an innovation as the technology inside Fairchild’s integrated circuits. By the end of the 1960s, after a decade of development, Apollo 11 was finally …
Soviet leader Nikita Khrushchev was committed to outcompeting the United States in every sphere, from corn production to satellite launches. Khrushchev himself was more comfortable on collective farms than in electronics labs. He understood nothing about technology but was obsessed with the notion of “catching up and overtaking” the United States, as he repeatedly promised to do.
In the 1930s, Barr and Sarant were integrated into an espionage ring led by Julius Rosenberg, the infamous Cold War spy. During the 1940s, Barr and Sarant worked on classified radars and other military systems at Western Electric and Sperry Gyroscope, two leading American technology firms. Unlike others in the Rosenberg ring, Barr and Sarant didn’t possess nuclear weapons secrets, but they had gained intimate knowledge about the electronics in new weapons systems.
Before the FBI could catch them, Sarant and Barr fled the country, eventually reaching the Soviet Union. When they arrived, they told KGB handlers they wanted to build the world’s most advanced computers. Barr and Sarant weren’t experts in computers, but nor was anyone else in the Soviet Union. Their status as spies was, in itself, a much admired credential, and their aura gave them access to resources. In the late 1950s, Barr and Sarant began building their first computer, called UM—the Russian word for “mind.” Their work attracted the attention of Shokin, the bureaucrat who managed the …
Khrushchev was enamored of grand projects, especially those that he could claim credit for, so he enthusiastically endorsed the idea of building a Soviet city for semiconductors. He embraced Barr and Sarant in a bear hug, promising his full support. Several months later, the Soviet government approved plans to build a semiconductor city in the outskirts of Moscow. “Microelectronics is a mechanical brain,” Khrushchev explained to his fellow Soviet leaders. “It is our future.”
The USSR excelled in quantity but not in quality or purity, both of which were crucial to high-volume chipmaking.
Spying could only get Shokin and his engineers so far. Simply stealing a chip didn’t explain how it was made, just as stealing a cake can’t explain how it was baked. The recipe for chips was already extraordinarily complicated. Foreign exchange students studying with Shockley at Stanford could become smart physicists, but it was engineers like Andy Grove or Mary Anne Potter who knew at what temperature certain chemicals needed to be heated, or how long photoresists should be exposed to light. Every step of the process of making chips involved specialized knowledge that was rarely shared …
Zelenograd might have seemed like Silicon Valley without the sunshine. It had the country’s best scientists and stolen secrets. Yet the two countries’ semiconductor systems couldn’t have been more different. Whereas Silicon Valley’s startup founders job-hopped and gained practical “on the factory floor” experience, Shokin called the shots from his ministerial desk in Moscow. Yuri Osokin, meanwhile, lived in obscurity in Riga, highly respected by his colleagues but unable to speak about his invention with anyone who lacked a security clearance.
Meanwhile, the “copy it” mentality meant, bizarrely, that the pathways of innovation in Soviet semiconductors were set by the United States. One of the most sensitive and secretive industries in the USSR therefore functioned like a poorly run outpost of Silicon Valley. Zelenograd was just another node in a globalizing network—with American chipmakers at the center.