Kindle Notes & Highlights
managing the coming wave requires confident, agile, coherent states, accountable to the people, filled with expertise, balancing interests and incentives, capable of reacting fast and decisively with legislative...
Cheap, omnipresent robots like those sketched above are, alongside a host of other transformative technologies we saw in part 2, utterly inevitable over a twenty-year horizon, and possibly much sooner.
The grand bargain is already in trouble. As the deluge begins, a series of new stressors will shake its foundations.
Where did WannaCry come from? It was built using technology created by the U.S. National Security Agency (NSA). An elite NSA unit called the Office of Tailored Access Operations had developed a cyberattack exploit called EternalBlue. In the words of one NSA staffer, these were “the keys to the kingdom,” tools designed to “undermine the security of a lot of major government and corporate networks both here and abroad.”
The hackers who stole the technology, a group known as the Shadow Brokers, put EternalBlue up for sale. From there it soon ended up in the hands of North Korean hackers, probably the state-sponsored Bureau 121 cyber unit. They then launched it on the world.
The NotPetya cyberattack almost brought Ukraine to its knees. Radiation monitoring systems at Chernobyl lost power. ATMs stopped dispensing money. Mobile phones went silent. Ten percent of the country’s computers were infected, and basic infrastructure from the electrical grid to the Ukrainian State Savings Bank went down. Major multinationals like the shipping giant Maersk were immobilized, collateral damage.
Such attacks demonstrate that there are those who would use cutting-edge technologies to degrade and disable key state functions. They show that core institutions of modern life are vulnerable.
A lone individual, the security researcher who discovered the malware’s kill switch, and a private company (Microsoft) patched up the systemic weakness. This attack did not respect national boundaries. Government’s role in handling the crisis was limited.
Now imagine if, instead of accidentally leaving open a loophole, the hackers behind WannaCry had designed the program to systematically learn about its own vulnerabilities and repeatedly patch them. Imagine if, as it attacked, the program evolved ...
through every hospital, every office, every home, constantly mutating, learning. It could hit life-support systems, military infrastructure, transport signaling, the energy grid, financial databases. As it spread, imagine the program learning to detect and stop further attempts to shut ...
Today’s cyberattacks are not the real threat; they are the canary in the coal mine of a new age of vulnerability and instability, degrading the nation-state’s role as the sole arbiter of security.
Here is a specific, near-term application of next-wave technology fraying the state’s fabric.
These fragility amplifiers, system shocks, emergencies 2.0, will greatly exacerbate existing challenges, shaking the state’s foundation, upsetting our already precarious social balance. This is, in part, a story of who can do what, a story of power and where it lies.
Power is “the ability or capacity to do something or act in a particular way … to direct or influence the behavior of others or the course of events.”
Technology is ultimately political because technology is a form of power. And perhaps the single overriding characteristic of the coming wave is that it will democratize access to power. As we saw in part 2, it will enable people to do things in the real world.
Wherever power is today, it will be amplified.
Anyone with goals—that is, everyone—will have huge help in realizing them. Overhauling a business strategy, putting on social events for a local community, or capturing enemy territory all get easier. Building an airline or grounding a fleet are both more achievable. Whether it’s commercial, religious,
Today, no matter how wealthy you are, you simply cannot buy a more powerful smartphone than is available to billions of people. This phenomenal achievement of civilization is too often overlooked.
The 2020 assassination of the Iranian nuclear scientist Mohsen Fakhrizadeh, carried out with a remotely operated machine gun, is a harbinger of what’s to come.
Videos of Boston Dynamics robots like Atlas and BigDog are easy to find on the internet. There you’ll see stocky, strange-looking humanoids and small doglike robots scamper over obstacle courses.
Now imagine robots equipped with facial recognition, DNA sequencing, and automatic weapons. Future robots may not take the form of scampering dogs. Miniaturized even further, they will be the size of a bird or a bee, armed with a small firearm or a vial of anthrax. They might soon be accessible to anyone who wants them. This is what bad actor empowerment looks like.
By 2028, $26 billion a year will be spent on military drones, and at that point many are likely to be fully autonomous.
Start-ups like Anduril, Shield AI, and Rebellion Defense have raised hundreds of millions of dollars to build autonomous drone networks.
Complementary technologies like 3-D printing and advanced mobile communications will reduce the cost of tactical drones to a few thousand dollars, putting them within reach of everyone from amateur enthusiasts to paramilitaries to lone psychopaths.
As the cybersecurity expert Bruce Schneier has pointed out, AIs could digest the world’s laws and regulations to find exploits, arbitraging legalities.
AI adept at exploiting not just financial, legal, or communications systems but also human psychology, our weaknesses and biases, is on the way.
the complex board game Diplomacy,
AIs could help us plan and collaborate, but also hints at how they could develop psychological tricks to gain trust and influence, reading and manipulating our emotions and behaviors with a frightening level of depth, a skill useful in, say, winning at Diplomacy or electioneering and building a political movement.
Now powerful, asymmetric, omni-use technologies are certain to reach the hands of those who want to damage the state.
Unlike an arrow or even a hypersonic missile, AI and bioagents will evolve more cheaply, more rapidly, and more autonomously than any technology we’ve ever seen.
Consequently, without a dramatic set of interventions to alter the current course, millions will have access to these capabilities in just a few years.
A world of deepfakes indistinguishable from conventional media is here. These fakes will be so good our rational minds will find it hard to accept they aren’t real.
Sermons from the radical preacher Anwar al-Awlaki inspired the Boston Marathon bombers, the attackers of Charlie Hebdo in Paris, and the shooter who killed forty-nine people at an Orlando nightclub. Yet al-Awlaki died in 2011, the first U.S. citizen killed by a U.S. drone strike, before any of these events.
The rise of synthetic media at scale and minimal cost amplifies both disinformation (malicious and intentionally misleading information) and misinformation (a wider and more unintentional pollution of the information space) at once. Cue an “Infocalypse,” the point at which society can no longer manage a torrent of sketchy material, where the information ecosystem grounding knowledge, trust, and social cohesion, the glue holding society together, falls apart.
The 1977 Russian flu is just one example. Just two years later anthrax spores were accidentally released from a secret Soviet bioweapons facility, producing a fifty-kilometer trail of disease that killed at least sixty-six people.
In 2021, a pharmaceutical company researcher near Philadelphia left smallpox vials in an unmarked, unsecured freezer. Fortunately, the person who found them while cleaning the freezer was wearing a mask and gloves. Had the virus gotten out, the consequences would have been catastrophic.
number of BSL-4 labs booms, only a quarter
Consider that throughout history, tools and technologies have been designed to help us do more with less. Each individual instance counts for almost nothing.
But what happens if the ultimate side effect of these compounding efficiencies is that humans ar...
But what if new job-displacing systems scale the ladder of human cognitive ability itself, leaving nowhere new for labor to turn?
In few areas will humans still be “better” than machines. I have long argued this is the more likely scenario. With the arrival of the latest generation of large language models, I am now more convinced than ever that this is how things will play out.
Early analysis of ChatGPT suggests it boosts the productivity of “mid-level college educated professionals” by 40 percent on many tasks. That in turn could affect hiring decisions: a McKinsey study estimated that more than half of all jobs could see many of their tasks automated by machines in the next seven years, while fifty-two million Americans work in roles with a
I believe this rosy vision is implausible over the next couple of decades; automation is unequivocally another fragility amplifier. As we saw in chapter 4, AI’s rate
The Private Sector Job Quality Index, a measure of how many jobs provide above-average income, has plunged since 1990; it suggests that well-paying jobs as a proportion of the total have already started to fall.

