Kindle Notes & Highlights
Read between January 7 and February 6, 2025
artificial intelligence (AI) and synthetic biology.
our future both depends on these technologies and is imperiled by them.
AI has been climbing the ladder of cognitive abilities for decades, and it now looks set to reach human-level performance across a very wide range of tasks within the next three years.
quixotic
Moreover, attempting to ban development of new technologies is itself a risk: technologically stagnant societies are historically unstable and prone to collapse. Eventually, they lose the capacity to solve problems, to progress.
This is the core dilemma: that, sooner or later, a powerful generation of technology leads humanity toward either catastrophic or dystopian outcomes. I believe this is the great meta-problem of the twenty-first century.
They finished with an alarming thought: a single person today likely “has the capacity to kill a billion people.” All it takes is motivation.
pessimism-aversion trap: the misguided analysis that arises when you are overwhelmed by a fear of confronting potentially dark realities, and the resulting tendency to look the other way.
Pessimism aversion is an emotional response, an ingrained gut refusal to accept the possibility of seriously destabilizing outcomes. It tends to come from those in secure and powerful positions with entrenched worldviews, people who can superficially cope with change but struggle to accept any real challenge to their world order.
without containment, every other aspect of technology, every discussion of its ethical shortcomings, or the benefits it could bring, is inconsequential.
they are inherently general and therefore omni-use, they hyper-evolve, they have asymmetric impacts, and, in some respects, they are increasingly autonomous.
Their creation is driven by powerful incentives: geopolitical competition, massive financial rewards, and an open, distributed culture of research. Scores of state and non-state actors will race ahead to develop them regardless of efforts to regulate and control what’s coming, taking risks that affect everyone, whether we like it or not.
A wave is a set of technologies coming together around the same time, powered by one or several new general-purpose technologies with profound societal implications.
For the futurist Alvin Toffler, the information technology revolution was a “third wave” in human society following the Agricultural and Industrial revolutions.
Carlota Perez has talked about “techno-economic paradigms” rapidly shifting amid technological revolutions.
Proliferation is catalyzed by two forces: demand and the resulting cost decreases, each of which drives technology to become even better and cheaper.
As you get more and cheaper technology, it enables new and cheaper technologies downstream.
It created a yet more mind-boggling proliferation: data, up twenty times in the decade 2010–2020 alone.
Until recently, the history of technology could be encapsulated in a single phrase: humanity’s quest to manipulate atoms.
information is a core property of the universe.
The coming wave of technology is built primarily on two general-purpose technologies capable of operating at the grandest and most granular levels alike: artificial intelligence and synthetic biology.
Technology is hence like a language or chemistry: not a set of independent entities and practices, but a commingling set of parts to combine and recombine.
about closely tracking the development of multiple exponential curves over decades, projecting them into the future, and asking what that means.
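The kind of projection described here can be sketched as a toy calculation. The starting value and doubling time below are illustrative assumptions, not figures from the book:

```python
# Extrapolating an exponential trend: a quantity that doubles every
# `doubling_years` years, projected `years` into the future.
def project(value, doubling_years, years):
    """Return value * 2**(years / doubling_years)."""
    return value * 2 ** (years / doubling_years)

# Hypothetical curve: 100 units today, doubling every 2 years,
# projected a decade out -> five doublings.
print(project(100, 2, 10))  # 3200.0
```

The point of the exercise is the shape of the curve, not the particular numbers: any steadily doubling quantity grows by a fixed multiple per decade.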
AI.
Deep learning uses neural networks loosely modeled on those of the human brain.
backpropagation
this remarkable technique, long derided in the field, cracked computer vision and took the AI world by storm.
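Backpropagation is, at bottom, the chain rule applied backward through a network's layers. A minimal sketch in plain Python, using a toy two-parameter "network" (nothing like a production training loop):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny two-layer network: y_hat = w2 * sigmoid(w1 * x).
# Backpropagation propagates the loss gradient from the output
# back to each weight via the chain rule.
def train(data, w1=0.5, w2=0.5, lr=0.5, epochs=2000):
    for _ in range(epochs):
        for x, y in data:
            h = sigmoid(w1 * x)       # forward pass: hidden activation
            y_hat = w2 * h            # forward pass: output
            err = y_hat - y           # dL/dy_hat for L = 0.5*(y_hat - y)**2
            # backward pass (chain rule):
            grad_w2 = err * h
            grad_h = err * w2
            grad_w1 = grad_h * h * (1 - h) * x   # sigmoid'(z) = h*(1-h)
            w2 -= lr * grad_w2
            w1 -= lr * grad_w1
    return w1, w2

# Learn to map input 1.0 to target 0.8.
w1, w2 = train([(1.0, 0.8)])
print(abs(w2 * sigmoid(w1) - 0.8) < 1e-3)  # True: prediction near target
```

Real deep-learning systems do exactly this over millions of weights and many layers; the mechanics per weight are no more than what this sketch shows.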
In 2012, AlexNet beat the previous winner by 10 percent.
Computer vision is the basis of Amazon’s checkout-less supermarkets and is present in Tesla’s cars, pushing them toward increasing autonomy.
In 1987 there were just ninety academic papers published at what became the field’s leading conference, Neural Information Processing Systems. By the 2020s there were almost two thousand. In the last six years there was a sixfold increase in the number of papers published on deep learning alone,
Their sensory systems will be as good as ours. This does not equate to superintelligence
developed systems to control billion-dollar data centers, a project resulting in 40 percent reductions in energy used for cooling.
AI really isn’t “emerging” anymore. It’s in products, services, and devices you use every day.
A big part of what makes humans intelligent is that we look at the past to predict what might happen in the future. In this sense intelligence can be understood as the ability to generate a range of plausible scenarios about how the world around you may unfold and then base sensible actions on those predictions.
LLMs take advantage of the fact that language data comes in a sequential order. Each unit of information is in some way related to data earlier in a series.
In AI this notion is commonly referred to as “attention.”
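The attention idea can be illustrated with a minimal scaled dot-product sketch. The vectors below are toy values with no learned projections, so this is a simplified illustration of the mechanism rather than a full transformer layer:

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query over a short sequence.

    Each position contributes in proportion to how well its key matches
    the query -- the model "attends" to the most relevant positions.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# The query matches the second key, so the output is pulled
# toward the second value vector.
q = [1.0, 0.0]
K = [[0.0, 1.0], [1.0, 0.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
out = attention(q, K, V)
print(out[1] > out[0])  # True: weight concentrates on position 2
```

This is how a model relates each unit of information to earlier units in the series: relevance is computed as a dot product, then used to weight the sequence.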
the model doesn’t use our vocabulary. Instead, it creates a new vocabulary of common tokens that helps it spot patterns across billions and billions of documents.
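Such token vocabularies are typically built by repeatedly merging the most frequent adjacent pair of symbols, byte-pair-encoding style. A toy sketch of that process (the example string and merge count are illustrative):

```python
from collections import Counter

def most_frequent_pair(tokens):
    """Count adjacent token pairs -- the core statistic behind
    byte-pair-encoding-style tokenizers."""
    return Counter(zip(tokens, tokens[1:])).most_common(1)[0][0]

def merge(tokens, pair):
    """Replace every occurrence of `pair` with a single merged token."""
    out, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            out.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out

# Start from single characters; repeated merges build a vocabulary of
# common fragments ("th", "the", ...) rather than whole dictionary words.
tokens = list("the theme of the thesis")
for _ in range(3):
    tokens = merge(tokens, most_frequent_pair(tokens))
print(tokens)
```

After a few merges, frequent fragments become single tokens, which is exactly what lets a model spot recurring patterns across billions of documents without a fixed dictionary.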
(GPT stands for generative pre-trained transformer.)
It appears to “understand” spatial and causal reasoning, medicine, law, and human psychology.
In 1996, thirty-six million people used the internet; this year it will be well over five billion. That’s the kind of trajectory we should expect for these tools, only much faster.
“brain-scale” models with many trillions of parameters.
Inflection AI, my new company, today uses around five billion times more compute than the DQN games-playing AI that produced those magical moments on Atari games at DeepMind a decade ago.
in less than ten years the amount of compute used to train the best AI models has increased by nine orders of magnitude—going from two petaFLOPs to ten billion petaFLOPs.
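The arithmetic behind this claim is easy to check: two petaFLOPs to ten billion petaFLOPs is a factor of five billion, which is just under ten orders of magnitude:

```python
import math

# The highlight's figures: training compute for the best AI models grew
# from 2 petaFLOPs to 10 billion petaFLOPs in under a decade.
start = 2        # petaFLOPs
end = 10e9       # petaFLOPs
orders = math.log10(end / start)
print(round(orders, 1))  # 9.7 -- roughly "nine orders of magnitude"
```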
Sometimes people seem to suggest that in aiming to replicate human-level intelligence, AI chases a moving target or that there is always some ineffable component forever out of reach. That’s just not the case.
The human brain is said to contain around 100 billion neurons with 100 trillion connections between them—it is often said to be the most complex known object in the universe.
EleutherAI, a grassroots coalition of independent researchers, has made a series of large language models completely open-source, readily available to hundreds of thousands of users.
What started with language has become the burgeoning field of generative AI.
A fully open-source model called Stable Diffusion lets anyone produce bespoke and ultrarealistic images, for free, on a laptop.
55 percent faster at completing coding tasks,