Kindle Notes & Highlights
by Byrne Hobart
Read between November 21 - November 26, 2024
This approach was similar to the Manhattan Project’s decision to pursue numerous sources of fissile material in case one of them didn’t work; if the goal was to deliver something that worked as quickly as possible (and if the consequences of success were extreme enough), waste was an acceptable price for making that happen.
But one shining exception to the general rule against extrapolating recent historical data into the future is Gordon Moore’s 1965 paper, “Cramming More Components Onto Integrated Circuits.” Moore looked at the transistor density of four recent chips released in the prior three years and observed that it had doubled every year. He argued that this trend could continue for at least the next 10 years. A decade later, he revised his prediction, speculating that chip density would continue to double every two years.
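Not from the book, but a quick sketch of the compounding involved: over a ten-year span, doubling every year works out to roughly a thousandfold increase, while the revised two-year cadence gives about thirty-twofold.

```python
# Quick sketch (not from the book) of what the two doubling rates imply
# over a ten-year horizon.
years = 10

growth_annual = 2 ** years           # doubling every year
growth_biennial = 2 ** (years / 2)   # doubling every two years

print(f"Doubling every year for {years} years: ~{growth_annual:,}x")           # ~1,024x
print(f"Doubling every two years for {years} years: ~{growth_biennial:.0f}x")  # ~32x
```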
Moore’s law represents a fundamental shift in the history of the chip. Until that point, the saga of the chip was a narrative propelled by technology. After Moore’s observation, it became a technology propelled by narrative.
Critically, this view of the future was not about where an industry was in the sine wave between recession and boom but about the continuation of a positive inflection in capabilities.
While Intel was initially uncertain about the value of the product during negotiations, it turned out to be a crucial innovation. First, it meant Intel possessed a chip that could suit a variety of niche needs that the company didn’t need to identify in advance. Second, it reduced the lead time for new products, because instead of designing circuits around a product and then producing the product, customers could build a program based on a known hardware spec and then buy chips off the shelf. Finally, it enabled the PC revolution.
This is not just a story about early-stage hustle. It’s also indicative of the nature of the computer industry’s growth. Software needs to run on hardware, but given enough processing power, a software program can simulate the hardware on which it’s expected to run. Gates and Allen had, as part of a previous venture, built an 8008 emulator on similar principles. Once they updated their emulator, they were able to begin building a new BASIC.
When you have the microprocessor doubling in power every two years, in a sense you can think of computer power as almost free. So you ask, why be in the business of making something that’s almost free? What is the scarce resource? What is it that limits being able to get value out of that infinite computing power? Software.
A 70-year logarithmic chart makes advances in chip density look like a smooth process, but each advance was a discrete, step-function change. Companies that were even a few months early or late would miss out on sales but would still bear the fixed cost of research and development as well as their capital expenditures.
This rise in fixed costs changed the nature of the business. When capital costs were low and the primary customer was NASA, pure technological wizardry was rewarded. As chips made their way into more consumer applications and the cost of starting a fab rose, the industry began to reward strategic brilliance. It’s no coincidence that Intel’s Andy Grove wrote the canonical book on tech company strategy—nor is it a coincidence that he called it Only the Paranoid Survive.
Some companies were still equity financed and aimed for disproportionate profits with high risk, while a growing number were debt financed and tried to compete with manufacturers of existing products on cost and quality. This dynamic began to resemble a mean-reversion model in which the expectation was that the world would demand the same products but in greater quantities and at lower costs.
Japanese brands like Sony, Toshiba, JVC, Sharp, and Canon recognized that a large proportion of the incremental profit of their goods went to companies like Intel rather than to domestic companies. Take a new calculator. While the device might have a 15 percent profit margin, the chips inside the calculator had incremental margins of 80 percent or higher and were hard or impossible to substitute. So, starting in the 1960s, Japanese companies began attempting to produce their own chips.
As with many other bubbles, they took a somewhat faith-based approach. Not only did TSMC have to believe demand for chip fabrication would continue to exist, it had to believe that this demand wouldn’t be met by companies with their own fabrication capabilities. While smaller chip companies were getting priced out of fabricating their own chips, consensus in the industry held that chip companies needed to do their own manufacturing as well as design. As Jerry Sanders, AMD’s CEO and a member of the Fairchild diaspora, once put it: “Now hear me and hear me well. Real men have fabs!”
Frontiers have a way of disappearing, though. Over time, rising foundry costs have reshaped the industry. With each new generation of chips, more companies can design them while fewer are equipped to manufacture them. Today, only Samsung and TSMC can make the most recent generation of chips, while Intel is years behind their efforts.
The recycling of profits from one epoch-defining company into early-stage investment in the next generation of businesses has been an important force in the technology industry. Google was backed by money from a cofounder of Sun Microsystems, while Facebook received support from a cofounder of PayPal. Fairchild Camera also benefited from this cycle, since the founder’s father was an early executive at IBM.
Every technology cycle leaves behind a residue of solved problems that make the same iteration of an idea both more likely to succeed and more likely to face competition. The first e-commerce companies, for example, had to figure out how to accept payments online—eBay originally assessed fees on an honor system and asked customers to mail in checks—but now third-party services solve this problem. So solving payments is no longer something founders have to think about, but it’s also one less potential competitive advantage.
The relationship among the members of what Michael Malone calls “the Intel trinity” in his 2014 book of the same name, especially between the more abrasive and focused Grove and the kind and charismatic Noyce, was sometimes fraught. Complementary skills often involve clashing personalities. One function of a company like Intel is that it forces people with varying traits and goals to collaborate on the same project. Just as an attractive force balances a repulsive force in an atom, such complementarity creates brand-new materials that couldn’t stably exist in any other format.
Moore would come to regret this decision. As Michael Malone describes in The Intel Trinity, after a few years of chip advancements, digital watches ceased to be pieces of advanced technology and became part of the jewelry business. Unsurprisingly, Intel did not know much about jewelry. For years, Moore continued to wear his Intel-manufactured digital watch as a reminder to focus on the company’s strengths.
This eventually attracted the attention of the Harvard administration, which was displeased that Gates had used school resources for a commercial project and invited Allen, a non-student, to participate. Gates would later take advantage of Harvard’s generous leave policies to work full-time on Microsoft without technically dropping out. This pattern—use of school resources, administrative concerns, and a strategically timed leave—has obvious parallels with the founding of Facebook. One detail omitted from the film The Social Network, probably because it would have seemed unbelievable, is that
…
Sporck’s devotion to cost-cutting was so extreme that at one point he decided to stop cutting the grass in front of National Semiconductor’s headquarters. An engineer brought a sheep to the office to keep the lawn under control.
The promise of being able to work on theoretical and applied research, with a large budget and the knowledge that the work would be put into practice at enormous scale, was enough to attract some of the best researchers.
Finally, policy drove R&D by making innovation investment a kind of tax haven. During the Great Depression, corporate taxes shot up and the US government imposed an undistributed-profits tax. Both policy changes discouraged companies from retaining earnings and reinvesting them. But since research expenses were a cost in accounting terms, businesses could invest their money there, untaxed. This became a way for corporations to “bank” their profits, converting today’s highly taxable gains into future profits that would, executives hoped, be realized at a lower tax rate.
Finally, much of the research was buried inside large, blue-chip companies—the sorts of companies that merit a business book only when they make a major misstep.
Following a bubble-like trajectory, executives set a goal as arbitrary as it was ambitious. In 1910, management decided AT&T would be able to offer cross-country phone calls before the 1915 World’s Fair in San Francisco. AT&T’s engineers did not know how to achieve what was being asked of them. With the help of newly hired physicists, they set about doing basic research. In other words, the goal and the deadline predated the means and the risk of failure was high. Ultimately, the AT&T team discovered they could use a type of vacuum tube called a triode to amplify signals over longer
…
On the other hand, because AT&T was doing research that was broadly useful, the company managed to gain an exemption under the Willis-Graham Act’s antitrust regulations, allowing it to swallow up most of the rest of the industry. As part of the deal, AT&T agreed to keep developing new technologies that would improve the telecommunications service. These inventions could not be business centers of their own.
This worked out well, if not perfectly, for all parties involved. Since management’s financial upside was capped, executives had a greater direct interest in the status of their work. A stable company that produced world-changing inventions was a prestigious place to work, even if the financial benefits of those inventions were captured by less constrained companies.
Over time, Bell Labs remained a great place for dreamers and tinkerers but an increasingly frustrating one for doers. As it turned out, some theorists were interested in seeing their products get built and sold (especially if it involved stock options).
While Skunk Works frowned on useless bureaucracy and box-checking, it didn’t eliminate rules. “There must be a minimum number of reports required,” Johnson emphasized, “but important work must be recorded thoroughly.” And “because only a few people will be used in engineering and most other areas, ways must be provided to reward good performance by pay not based on the number of personnel supervised,” he insisted. These meta rules, which were strictly enforced, guarded against the unchecked proliferation of more specific mandates.
Skunk Works succeeded where private–public partnerships often don’t. One reason is that the company and the US government shared common goals, notably surviving the Cold War. From a cynical shareholder perspective, losing a war is also bad for dividends.
The ideal position for a defense contractor is to prolong a conflict without resolving it; many of Lockheed’s finest productions did exactly that, such as developing weapons and surveillance technologies to counter Russian weapons or give the US more data about Soviet plans. These tools made outright conflict less likely but also stimulated demand for upgrades by forcing changes in Soviet strategy that would have to be countered anew.
Soon after, the term worked its way into language, both in common usage—as a verb meaning “to copy”—and in financial nomenclature, as investors throughout the 1960s tried to find “the next Xerox.”
PARC benefited from a glut of talent—a direct consequence of the winding down of the Apollo program, which led to job cuts at aerospace companies and a tougher job market for PhDs in hard-science fields. This provided PARC with access to a bumper crop of smart scientists. Meanwhile, Xerox’s continued profitability gave the group the budget necessary to explore ideas that might have long-term payoffs.
Neither the Alto nor the Ethernet protocol was a commercial success—at least not for Xerox. Ethernet developer Robert Metcalfe left PARC to found 3Com, which commercialized the protocol. Once Ethernet was being promoted by a small independent company rather than by the titanic Xerox, other big firms were willing to adopt it.
At one point in the early ’80s, Steve Jobs accused Microsoft of misusing Apple’s intellectual property. Bill Gates replied, “Well, Steve, I think there’s more than one way of looking at it. I think it’s more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.”
Instead of slackening, though, antitrust enforcement strengthened in the 1970s. Already-strict rules governing new acquisitions started to be applied to companies like IBM and AT&T, which had grown organically. In other words, the bargain went only one way. At the same time, a downshift in military spending and the space program eliminated a big, price-insensitive customer for new products. Completing the trifecta of bad news, higher oil prices damaged hydrocarbon-reliant companies like Du Pont and pressed inflation upward.
High inflation ushered in an era of higher interest rates. As a result, future profits were less important relative to immediate cost cutting. Since a company’s theoretical value is based on its future earnings discounted back to the present, an increase in the discount rate means profits from the distant future are less meaningful. Meanwhile, a premium is placed on immediate earnings.
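A minimal sketch of the present-value arithmetic behind this point; the rates and horizon below are illustrative assumptions, not figures from the book.

```python
# Sketch of the discounting arithmetic; the rates and horizon are
# illustrative assumptions, not figures from the book.
def present_value(profit, rate, years):
    return profit / (1 + rate) ** years

for rate in (0.05, 0.15):
    pv = present_value(100, rate, 15)
    print(f"At a {rate:.0%} discount rate, $100 earned in 15 years is worth ${pv:.2f} today")
# At 5% the distant $100 is still worth about $48 today; at 15% it is worth
# about $12, which is why higher rates put a premium on immediate earnings.
```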
In the 1960s, a research-intensive company might have traded at 30 times its earnings or higher, meaning investors were primarily betting on future growth. By the late 1970s, the average large stock in the US traded at just seven times earnings, implying a very low value for distant, hypothetical profits...
Many big tech companies, including Meta, Google parent Alphabet, Snap, Shopify, Lyft, Pinterest, and Zoom, use dual-class stock, which grants founders disproportionate voting power. This makes it hard for investors to fire a bad CEO, but it also makes it difficult for them to lay off a good one.
Google offers an excellent analytics package (which makes it easier for sites to measure the return on advertising through, to take a hypothetical example, lots of Google search ads); Microsoft incorporates AI into search in part because its low market share means that the cost burden of AI is lower, while the high fixed costs of search mean that it’s sensible for them to buy market share even with expensive features; Meta supports open-source projects like React, PyTorch, and the LLaMA family of large language models, knowing that they’re likely to be the single biggest user of these products
…
As discussed in Chapter 1, the abundance of money in big tech over the last decade has coincided with a scarcity of ideas, as indicated by the billions of dollars the large tech firms have been amassing. But beyond antitrust law and regulation, recent technological advances like the theoretical breakthrough of transformer-based AI and the user-facing advancements represented by OpenAI’s ChatGPT and other large language model-based tools might be the most significant accelerants for another corporate R&D bubble.
And they’re not just building their own tools but also partnering with or acquiring AI companies. (Antitrust is less complicated when a business is being bought for its potential to create new markets rather …
It’s important to note that raising corporate taxes does not always encourage R&D. All else being equal, the two forces should cancel each other out—companies can defer profits and save money on taxes, but when they realize those profits, they still pay taxes. Tax rates only operate as an incentive to invest in R&D today if there’s a belief that taxes will be lower tomorrow.
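A small sketch of the cancellation argument; the tax rates below are made-up assumptions, not figures from the book.

```python
# Illustrative comparison (rates are assumptions, not from the book):
# book $100 of profit today, or spend it on deductible R&D and realize
# an equivalent $100 profit later.
profit = 100
rate_now = 0.40

for rate_later in (0.40, 0.25):
    keep_now = profit * (1 - rate_now)        # take the profit today
    keep_later = profit * (1 - rate_later)    # deduct R&D now, pay tax when the profit is realized
    print(f"future rate {rate_later:.0%}: ${keep_now:.0f} now vs ${keep_later:.0f} deferred")
# At equal rates the two paths cancel out; deferring profits through R&D
# only pays if the future tax rate is expected to be lower.
```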
It’s notable that many of these subsidized products also compete, directly or indirectly, with profitable products sold by competitors. At small companies, competitors are a second-order concern; making customers happy is what matters most. But as market share jells and network effects kick in, it can make sense for companies to subsidize things in part because they inflict pain on competitors. Google can devote more management attention to improving its free office software suite if search is being threatened, for example.
The race to discover and exploit new sources of oil was perhaps the last gasp of the Victorian colonial adventure story. Small teams of geologists and explorers trekked across desert wastelands, dense jungles, and other difficult terrain, cutting deals with tribal chiefs, kings, and tsars. One explorer in what is now Iran realized that the best places to look for oil were in locations where locals worshiped fire, as a spiritually significant eternal flame might indicate a geologically significant deposit of natural gas.
These early days were characterized by extreme boom and bust cycles. Oil went for $16 a barrel in 1859, under 50¢ in 1861, and $8 in 1864. Fortunes were made and lost in months. Meanwhile, …
Decentralized entrepreneurial exploration meant that any time drilling for oil became unusually profitable, new competitors would spring up, fueling a speculative bubble. In contrast, the refining portion of the industry was consolidated, which meant its future was being directed according to a coherent plan. Part of that plan was to end oil’s extreme price volatility.
Early on, Standard Oil had benefited from both the booms (through high profits) and the busts (by deploying the money it had conservatively saved to acquire less stable competitors). But as the industry grew, it was in Standard Oil’s interest to minimize volatility so consumers would feel confident their affordable kerosene lamps wouldn’t become wildly expensive. Standard Oil gradually reduced its reliance on market prices for its oil purchases. In 1895 it issued a memo to oil producers announcing that henceforth it would ignore the market price entirely and transact directly with them at
…
By 2001, Mitchell Energy & Development had attracted the attention of the oil and gas industry. Its output was somehow growing even as costs fell. The usual dynamic in energy is that companies target the most cost-effective projects first; they can grow, but only by drilling in places where the breakeven cost is higher. But a key element of the economics of fracking is that as companies scale up, their marginal cost goes down as they get more efficient at fracking in general and at understanding the dynamics of particular areas in which they’re active. This is precisely what happened at
…
A product that’s twice as expensive as its identical competitor isn’t viable, but a product that’s twice as expensive and getting 10 percent cheaper every year will eventually dominate the market.
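The crossover arithmetic, sketched under the assumption (not stated in the highlight) that the incumbent's price stays flat: the challenger undercuts it in about seven years.

```python
# Crossover sketch, assuming the incumbent's price stays flat (an
# assumption the highlight does not state).
incumbent = 1.0
challenger = 2.0        # starts at twice the price
annual_decline = 0.10   # gets 10 percent cheaper every year

years = 0
while challenger >= incumbent:
    challenger *= (1 - annual_decline)
    years += 1

print(f"The challenger undercuts the incumbent after {years} years")  # 7
```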
An old joke, sometimes attributed to Mark Twain, is that a mine is “a hole in the ground with a liar on top.”
Given this history, energy investors are primed to be suspicious. As a result, energy entrepreneurship attracts good storytellers. For conventional energy companies, each attempt to drill has a cost, and companies sometimes need to raise funds after a series of dry holes. While they could raise more capital up front, doing so would mean giving away more of the upside if they were to get lucky early. The companies that survive this dynamic are often those that operate in a chronically under-capitalized way, and are therefore good at convincing investors to take the risk and fund one last
…