Kindle Notes & Highlights
Read between December 22 and December 24, 2023
“An incredibly compelling window into the current developments and exponential future of AI—from the ultimate insider… If you really want to understand how society can safely navigate this world-changing technology, read this book.”
This is the core dilemma: that, sooner or later, a powerful generation of technology leads humanity toward either catastrophic or dystopian outcomes. I believe this is the great meta-problem of the twenty-first century. This book outlines exactly why this terrible bind is becoming inevitable and explores how we might confront it.
Right now none of the technologies described in this chapter are even close to their full potential.
Early pioneers in the 1950s predicted that it would take about a decade to develop. As with so many of the technologies described here, that proved a significant underestimation.
Not everyone agrees these technologies are either as locked in or as consequential as I think they are. Skepticism and pessimism aversion are not unreasonable responses, given how much uncertainty there is.
Combating such attacks is difficult and expensive; both the Americans and the Israelis use $3 million Patriot missiles to shoot down drones worth a couple hundred dollars.
Engineers can’t peer beneath the hood and easily explain why a given output or decision was produced. GPT-4, AlphaGo, and the rest are black boxes, their outputs and decisions based on opaque and intricate chains of minute signals.
Technological rivalry is a geopolitical reality. Indeed it always has been. Nations feel the existential need to keep up with their peers. Innovation is power.
Science and technology live and breathe on free debate and the open sharing of information, to the extent that openness has itself grown into a powerful (and amazingly beneficial) incentive.
Fossil fuel use is up sharply despite all the moves toward clean electricity as a power source.
My experience with local government, UN negotiations, and nonprofits also gave me invaluable firsthand knowledge of their limitations. They are often chronically mismanaged, bloated, and slow to act.
Without trust, from the ballot box to the tax return, from the local council to the judiciary, societies are in trouble.
A Democracy Perception Index poll found that across fifty nations two-thirds of respondents felt the government “rarely” or “never” acted in the public interest.
WannaCry was built using technology created by the U.S. National Security Agency (NSA). An elite NSA unit called the Office of Tailored Access Operations had developed a cyberattack exploit called EternalBlue.
In the face of an abundance of ultra-low-cost equivalents, the days of this kind of “cognitive manual labor” are numbered.
We absolutely need to be able to verify, at every level, the safety, integrity, and uncompromised nature of a system. That in turn is about access rights and audit capacity, about adversarially testing systems and having teams of white-hat hackers, or even AIs, probe for weaknesses, flaws, and biases. It’s about building technology in an entirely different way, with tools and techniques that don’t exist yet. External scrutiny is essential.