In 1945, Vannevar Bush drew on the lessons of the war to draft a blockbuster report on the future of American innovation titled “Science, the Endless Frontier.” The most important idea that emerged from the Bush report was the primacy of “basic research”—a term Bush meant to refer to science at universities and research centers that seeks to understand the world “without thought of practical ends.”
Rather than rely on private philanthropy, or the closed-door laboratories of corporate behemoths, Bush saw the future of science as a kind of hub-and-spoke system, with the federal government directing funds to the most deserving university researchers.
If these institutions had never been created or expanded, the lives of millions, even billions of people around the world would be shorter than they are today, and people would be sicker. If they disappeared tomorrow, the world would instantly be worse.
The first problem that emerged with the rise of the NIH echoes the criticism from our chapters on housing, energy, and the difficulty of building things in America: rules have increased, while efficiency has decreased.
A decade later, Senator William Proxmire, a Wisconsin Democrat, created the Golden Fleece Award to draw negative attention to the worst use of government money in science.
The instinct to make science democratically responsible has gunked up the scientific process.
To appreciate the explosion of scientific paperwork requirements, imagine if every scientist working in America contracted a chronic fatigue disorder that made it impossible for them to work for half of the year.
Universities have whole floors of staff whose main job is to administer these NIH grants.
The rules exist for a reason, Doench acknowledged. Some scientists in the past probably abused their funding.
The second problem coming out of the growth of the NIH is that the onerous process of applying for grants has put a premium on status-seeking rather than pure science.
You needed the kind of interpersonal savvy that got you invited to speak at conferences or made people eager to mentor and support you.
“There is a hidden curriculum for navigating grants, and it is critical for success as a scientist today,” Azoulay said.
While many discoveries depend on high-risk research that departs from the herd—like embracing the potential of mRNA while others rush toward DNA—modern science too often plays it safe.
“We find that evaluators uniformly and systematically give lower scores to proposals with increasing novelty,” the team concluded.
Bias against novelty, risk, and edgy thinking is a tragedy, because the most important breakthroughs in scientific history are often wild surprises that emerge from bizarre obsessions.
The most famous pharmaceutical breakthrough of the last decade is thus built on the foundation of a most delightfully peculiar obsession: lizard spit. Science is often nonlinear in this way.
But this is how science often works; a broad base of knowledge is built, upon which we piece together disparate fragments of a puzzle to create new breakthroughs.
Isaac Newton famously said he saw further by standing “on the shoulders of giants.” But clearly, some brilliant ideas are not born giants. They are born as all children are born—small and helpless, requiring care and protection to grow.
In a strange way, the problem isn’t that too much science is “doomed to fail,” he said. It’s the opposite. Too much science is, in his words, “doomed to succeed”—fated to duplicate what we know rather than risk failure by reaching into the unknown.
In 2017, longtime NIH director Francis Collins acknowledged, in an email to the libertarian venture capitalist Peter Thiel, that NIH needed “to liberate young scientists from training periods that are much too long” and that “some of the ways in which we support” biomedical research are “outdated.”
In 1958, vowing that the US should never again be on the other side of a technological surprise, the Department of Defense established the Advanced Research Projects Agency.
With an annual budget of $4 billion—about one-tenth of the NIH’s—DARPA punches well above its weight. One reason is that DARPA empowers domain experts called program managers to pay scientists and technologists to work together on projects of their own design.
They can make big counterintuitive bets, are not punished for failure, and are not hauled before congressional committees for supporting weird-sounding projects.
When ARPANET went online in 1969, the world’s original—and very basic—internet required the collaboration of individuals and firms who would never have otherwise come together. To invent an online network of information, Licklider and Taylor built an offline network of minds.
The announcement was IBM’s. But the breakthrough itself began with DARPA.
The American innovation system would benefit from trusting individuals more and bureaucracies less.
In 1947, Bell Labs engineers built the first transistor, which enabled the development of smaller and more efficient electronic devices.
As a state-sanctioned monopoly, AT&T could invest in every facet of telecommunications science without concern for short-term profits, which gave its scientists and engineers the freedom to pursue ambitious projects over decades.
After World War II, AT&T was a goliath within a goliath—a huge government-sanctioned monopoly inside a country that came to dominate fields like chemistry and quantum mechanics after Nazi Germany’s assault on Europe forced many of the continent’s best minds to flee to America.
“If Bell Labs had a formula, it was to hire the smartest people, give them space and time to work, and make sure that they talk to each other,” Gertner said.
The journal entry struck Gertner as a microcosm of Bell Labs’s unusual approach to science. Here was a chemist, tinkering with the fundamental principles of electrons, thinking about how his invention would become a product that went through factory assembly and ended up in people’s houses.
Because Bell Labs scientists worked within an offshoot of AT&T, it was natural for them to consider the commercial potential of their work, which might explain how they created so many useful products.
We could fix the manufactured scarcities of our immigration system and make it easier for the world’s most brilliant people—who often graduate from American schools—to stay and work in the US.
But that’s what high-risk science does: it takes on projects with a real possibility of failure.
If these things are possible in the realm of physical reality, then they are possible to discover; and if they can be discovered in a century, they can be discovered in a decade, or in a year. These achievements will require a level of risk-taking and ambition that we are too effective at snuffing out.
For many, progress appears to be a mere timeline of such eureka moments. Our mythology of invention treats the moment of discovery as a sacred scene.
You can think of this as the “eureka theory of history.” It’s the story of progress you might expect to see in Hollywood or to read in nonfiction books that hail the lonely hero whose flash of insight changes the world. But this approach to history is worse than incomplete; it’s downright wrong. Inventions do matter greatly to progress. But too often, when we isolate these famous scenes, we leave out the most important chapters of the story—the ones that follow the initial lightning bolt of discovery.
Thirteen years after one of the most famous discoveries in science history, penicillin had accomplished practically nothing.
But progress is more about implementation than it is about invention. An idea going from nonexistence to existence—from zero to one—introduces the possibility of change. But the way individuals, companies, and governments take an idea from one to one billion is the story of how the world actually changes. And it doesn’t always change, even after a truly brilliant discovery. The ten-thousand-year story of human civilization is mostly the story of things not getting better: diseases not being cured, freedoms not being extended, truths not being transmitted, technology not delivering on its promise.
To borrow some familiar language, it’s not just that ideas are getting harder to find. The problem is also that new ideas are getting harder to use.
Burdened by regulations and inattention to cost-effective production, basic elevators cost four times more in New York City than in Switzerland. Americans invented the world’s first nuclear reactor and solar cell. But today, we’re well behind various European and Asian countries in deploying and developing these technologies.
Innovation can make impossible problems possible to solve, and policy can make impossible technologies possible to create.
Instead, we are stuck between a progressive movement that is too afraid of growth and a conservative movement that is allergic to government intervention.
By March 1945, there was enough penicillin for just about everyone in America.
The lesson, which the US seems to have forgotten in the last few decades, is that implementation, not mere invention, determines the pace of progress. In 1941, penicillin was a stalled science project, languishing in the resource-starved labs of warring Europe. It became a lifesaving product only thanks to hundreds of American scientists and engineers. Almost every technology is like this. “Most major inventions initially don’t work very well,” the economic historian Joel Mokyr said. “They have to be tweaked, the way the steam engine was tinkered with by many engineers over decades.”
Edison did not make electric light possible. But his microinventions did something even more important. Through exhaustive tinkering, embodying, and scaling, he made electric light useful.
Since World War II, the American approach to innovation has been to throw money at the initial eureka moment, sporadically support its development, and then watch idly as the technological frontier moves to other countries.
Thousands of scientists, engineers, and entrepreneurs flocked to the field, sensing the dawn of an energy revolution.
Some of the dismantling was painfully literal: in 1986, Reagan removed the solar hot-water panels installed on the White House roof by Jimmy Carter.
Without sufficient oil and gas resources to power a billion-person economy, China has had existential motivation to develop its own domestic energy technology.