Kindle Notes & Highlights
Read between January 5 - January 12, 2024
With AI, we could create systems that are beyond our control and find ourselves at the mercy of algorithms that we don’t understand. With biotechnology, we could manipulate the very building blocks of life, potentially creating unintended consequences for both individuals and entire ecosystems.
choice—a choice between a future of unparalleled possibility and a future of unimaginable peril. The fate of humanity hangs in the balance, and the decisions we make in the coming years and decades will determine whether we rise to the challenge of these technologies
Almost every culture has a flood myth.
Almost every foundational technology ever invented, from pickaxes to plows, pottery to photography, phones to planes, and everything in between, follows a single, seemingly immutable law: it gets cheaper
and easier to use, and ultimately it proliferates, far and wide.
This ecosystem of invention defaults to expansion. It is the inherent nature of technology.
Further progress in one area accelerates the others in a chaotic and cross-catalyzing process beyond anyone’s direct control.
And yet alongside these benefits, AI, synthetic biology, and other advanced forms of technology produce tail risks on a deeply concerning scale.
The presenter showed how the price of DNA synthesizers, which can
print bespoke strands of DNA, was falling rapidly. Costing a few tens of thousands of dollars, they are small enough to sit on a bench in your garage and let people synthesize—that is, manufacture—DNA. And all this is now possible for anyone with graduate-level training in biology or an enthusiasm for self-directed learning online.
The pessimism-aversion trap: the misguided analysis that arises when you are overwhelmed by a fear of confronting potentially dark realities, and the resulting tendency to look the other way.
Pretty much everyone has some version of this reaction, and the consequence is that it’s leading us to overlook a number of critical t...
Pessimism aversion is an emotional response, an ingrained gut refusal to accept the possibility of seriously destabilizing outcomes.
At the dawn of the Agricultural Revolution the worldwide human population numbered just 2.4 million. At the start of the Industrial Revolution, it approached 1 billion, a four-hundred-fold increase that was predicated on the waves of the intervening period.
Consider that children who grew up traveling
by horse and cart and burning wood for heat in the late nineteenth century spent their final days traveling by airplane and living in houses warmed by the splitting of the atom.
Our phones are the first thing we see in the morning and the last at night. Every aspect of human life is affected: they help us find love and new friends while turbocharging supply chains. They influence who gets elected and how, where our money is invested, our children’s self-esteem, our music tastes, our fashion, our food, and everything in between.
It’s easy to get lost in the details, but step back and you
can see waves gathering speed, scope, accessibility, and consequence.
Once they gather momentum, they rarely stop. Mass diffusion, raw, rampant proliferation—this is technology’s historical default, ...
History tells us that technology diffuses, inevitably, eventually to almost everywhere, from the first campfires to the fires of the Saturn V rocket, from the first scrawled letters to the endless text of the internet.
Technology’s unavoidable challenge is that its makers quickly lose control over the path their inventions take once introduced to the world.
Technology exists in a complex, dynamic system (the real world), where second-, third-, and nth-order consequences ripple out unpredictably.
Understanding technology is, in part, about trying to understand its unintended consequences, to predict not just positive spillovers but “revenge effects.”
Technology’s problem here is a containment problem. If this aspect cannot be eliminated, it might be curtailed. Containment is the overarching ability to control, limit, and, if need be, close down technologies at any stage of their
development or deployment. It means, in some circumstances, the ability to stop a technology from proliferating in the first place, checking the ripple of unintended consequences (both good and bad).
Just because consequences are difficult to predict doesn’t mean we shouldn’t try. In most cases, containment is about meaningful control, the capability to stop a use case, change a research direction, or deny access to harmful actors. It means preserving the ability to steer waves to ensure their impact reflects our values, helps us flourish as a species,
and does not introduce significant harms that outweigh their benefits.
Containment encompasses regulation, better technical safety, new governance and ownership models, and new modes of accountability and transparency, all as necessary (but not sufficient) precursors to safer technology.
Containment shouldn’t be seen as the final answer to all technology’s problems; it is rather the first, critical step, a foundation on which the future is built.
Technical containment refers to what happens in a lab or an R&D facility. In AI, for example, it means air gaps, sandboxes, simulations, off switches, hard built-in safety and security measures—protocols for verifying the safety or integrity or uncompromised nature of a system and taking it offline if needed.
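One of the technical containment measures named above, verifying a system's integrity and taking it offline if the check fails, can be sketched in a few lines. This is a toy illustration under assumed names (the artifact bytes and trusted digest are stand-ins, not anything from the book):

```python
import hashlib

# Toy sketch of an integrity-verification protocol: compute a
# cryptographic digest of the deployed artifact and refuse to run it
# (i.e., take it offline) if the digest does not match a trusted
# reference. "model-weights-v1" is an illustrative placeholder.
TRUSTED_DIGEST = hashlib.sha256(b"model-weights-v1").hexdigest()

def run_if_uncompromised(artifact: bytes) -> str:
    """Return 'running' only when the artifact matches the trusted
    digest; otherwise 'offline' -- the off switch."""
    if hashlib.sha256(artifact).hexdigest() != TRUSTED_DIGEST:
        return "offline"  # integrity check failed: do not run
    return "running"
```

Real deployments layer many such checks (signed binaries, attestation, sandboxing); this shows only the verify-then-run shape of the idea.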
Technologies are ideas, and ideas cannot be eliminated.
People have often said no, desiring to contain technology, for a plethora of reasons. It’s just never been enough. It’s not that the containment problem hasn’t been recognized in history; it’s just that it has never been solved.
After all, it was only in 2019 that U.S. command and control systems were upgraded from 1970s hardware and eight-inch floppy disks. The world’s most sophisticated and destructive weapons arsenal ran on technology so antiquated it would be unrecognizable (and unusable) to most people alive today.
Accidents are legion. In 1961, for example, a B-52 in the skies above North Carolina developed a fuel leak. The crew ejected from the ailing aircraft, leaving it and its payload to plummet to the ground. In the process, a live hydrogen bomb’s safety switch flicked to “armed” as it crashed into a field. Of its four safety mechanisms, just one was left in place, and an explosion was miraculously avoided. In 2003 the British Ministry of Defence disclosed more than 110 near misses and accidents in the history of its nuclear weapons program. Even the Kremlin, hardly a model of openness, has
Plenty of nuclear material is unaccounted for, from hospitals, businesses, militaries, even recently from Chernobyl. In 2018, plutonium and cesium were stolen from a Department of Energy official’s car in San Antonio, Texas, while they slept in a nearby hotel. The nightmare scenario is a loose warhead, stolen in transit or even somehow missed in an accounting exercise. It may sound fanciful, but the United States has in fact lost at least three nuclear weapons.
Through trial and error, it learned to control the paddle, bounce the ball back and forth, and knock out bricks row by row. Impressive stuff. Then something remarkable happened. DQN appeared to discover a new, and very clever, strategy. Instead of simply knocking out bricks steadily, row by row, DQN began targeting a single column of bricks. The result was the creation of an efficient route up to the back of the block of bricks. DQN had tunneled all the way to the top, creating a path that then enabled the ball to simply bounce off the back wall, steadily destroying the entire set of bricks
like a frenzied ball in a pinball machine. The method earned the maximum score with minimum effort. It was an uncanny tactic, not unknown to serious gamers, but far from obvious. We had watched as the algorithm taught itself something new. I was stunned.
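The trial-and-error learning behind the DQN story can be illustrated with tabular Q-learning, a simplified relative of DQN (which replaces the table with a deep neural network). The states and actions below are invented labels, not anything from the actual system:

```python
def q_learning_update(q, state, action, reward, next_state,
                      alpha=0.1, gamma=0.99):
    """One step of tabular Q-learning: nudge the estimated value of
    (state, action) toward the reward plus the discounted best value
    of the next state. Repeated over many plays, high-scoring tactics
    (like tunneling up a column) accumulate value."""
    best_next = max(q[next_state].values()) if q.get(next_state) else 0.0
    q.setdefault(state, {}).setdefault(action, 0.0)
    q[state][action] += alpha * (reward + gamma * best_next - q[state][action])

# One rewarding step raises the value of the chosen action.
q = {}
q_learning_update(q, "ball_left", "move_left", reward=1.0, next_state="rally")
```

No strategy is programmed in; value estimates simply drift toward whatever sequence of actions yields reward, which is how the tunneling tactic could emerge unbidden.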
When IBM’s Deep Blue beat Garry Kasparov at chess in 1997, it used the so-called brute-force technique, where an algorithm aims to systematically crunch through as many possible moves as it can. That approach is hopeless in a game with as many branching outcomes as Go.
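The arithmetic behind that claim is simple exponential blowup: brute-force search cost grows as the branching factor raised to the search depth, and Go's branching factor dwarfs chess's. The figures below are common rough approximations, not exact counts:

```python
# Brute-force search examines roughly (branching factor) ** depth
# positions. Approximate average branching factors: ~35 legal moves
# per position in chess, ~250 in Go.
CHESS_BRANCHING, GO_BRANCHING = 35, 250
DEPTH = 10  # look ahead ten half-moves

chess_positions = CHESS_BRANCHING ** DEPTH
go_positions = GO_BRANCHING ** DEPTH

# Even at this shallow depth, Go's tree is hundreds of millions of
# times larger than chess's -- Deep Blue's method cannot keep up.
blowup = go_positions // chess_positions
```

Each extra ply multiplies the gap by another factor of roughly seven, which is why Go demanded a fundamentally different approach.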
From fire to electricity, stone tools to machine tools, hydrocarbons to medicines, the journey described in chapter 2 is essentially a vast, unfolding process in which our species has
slowly extended its control over atoms. As this control has become more precise, technologies have steadily become more powerful and complex, giving rise to machine tools, electrical processes, heat engines, synthetic materials like plastics, and the creation of intricate molecules capable of defeating dreaded diseases. At root, the primary driver of all of these new technologies is material—the ever-growing manipulation of their atomic elements.
Starting in the mid-twentieth century, technology began to operate at a higher level of abstraction. At the heart of this shift was the realization that information is a core property of the universe. It can be encoded in a binary format and is, in the form of DNA, at the core of how life operates. Strings of ones and zeros, or the base pairs of DNA—these are not just mathematical curiosities. They are foundational and powerful. Understand and control these streams of information and you might stea...
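The idea that both text and DNA reduce to streams of bits can be made concrete in a few lines. The two-bits-per-base mapping below is an illustrative convention for the example, not a laboratory standard:

```python
# Text as bits: each ASCII character becomes one 8-bit byte.
text_bits = "".join(f"{byte:08b}" for byte in "life".encode("ascii"))

# DNA as bits: four bases need only two bits each (illustrative mapping).
BASE_TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}
dna_bits = "".join(BASE_TO_BITS[base] for base in "GATTACA")
```

Either way the result is the same kind of object, a binary string, which is the sense in which information is a common currency across computing and biology.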
The coming wave of technology is built primarily on two general-purpose technologies capable of operating at the grandest and most granular levels alike: artificial intelligence and synthetic biology.
“the overall collection of technologies bootstraps itself upward from the few to the many and from the simple to the complex.” Technology is hence like a language or chemistry: not a set of independent entities and practices, but a commingling set of parts to combine and recombine.
In the case of AlexNet, the training data consisted of images. Each red, green, or blue pixel is given a value, and the resulting array of numbers is fed into the network as an input. Within the network, “neurons” link to other neurons by a series of weighted
connections, each of which roughly corresponds to the strength of the relationship between inputs. Each layer in the neural network feeds its input down to the next layer, creating increasingly abstract representations.
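The layered, weighted-connection structure described above can be sketched in miniature. This is not AlexNet itself, just a toy two-layer network with made-up weights showing how each layer's outputs feed the next:

```python
def relu(x):
    # Rectified linear activation, as used in AlexNet-era networks.
    return max(0.0, x)

def dense_layer(inputs, weights, biases):
    # Each "neuron" takes a weighted sum of the previous layer's
    # outputs plus a bias, then applies a nonlinearity.
    return [
        relu(sum(w * x for w, x in zip(row, inputs)) + b)
        for row, b in zip(weights, biases)
    ]

def forward(pixels, layers):
    # Feed the input through the stack; each layer produces a more
    # abstract representation of the one before it.
    activation = pixels
    for weights, biases in layers:
        activation = dense_layer(activation, weights, biases)
    return activation

# Toy network: 4 "pixel" values -> 3 hidden neurons -> 2 outputs.
layers = [
    ([[0.5, -0.2, 0.1, 0.3],
      [-0.4, 0.6, 0.2, -0.1],
      [0.05, 0.05, 0.05, 0.05]], [0.0, 0.0, 0.1]),
    ([[1.0, -1.0, 0.5],
      [0.2, 0.3, -0.6]], [0.0, 0.05]),
]
output = forward([0.2, 0.5, 0.1, 0.9], layers)
```

AlexNet's convolutional layers share and slide their weights over the image rather than connecting every input to every neuron, but the feed-forward principle is the same.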
AI is becoming much easier to access and use: tools and infrastructure like Meta’s PyTorch or OpenAI’s application programming interfaces (APIs) help put state-of-the-art machine learning capabilities in the hands of nonspecialists. 5G and ubiquitous connectivity create a massive, always-on user base.