Kindle Notes & Highlights
by Max Tegmark
Read between December 4, 2024 - August 18, 2025
the petite 10⁻¹⁶ kg bacterium Pelagibacter, believed to account for more biomass than all the world’s fish combined.
Whereas you or a future Earth-sized supercomputer can have many thoughts per second, a galaxy-sized mind could have only one thought every hundred thousand years,
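A back-of-the-envelope version of this claim, assuming a fully integrated thought requires signals to cross the mind at light speed and a galaxy-sized mind spans roughly the Milky Way’s diameter of about 100,000 light-years:
\[
t_{\text{thought}} \gtrsim \frac{D}{c} \approx \frac{10^{5}\,\text{light-years}}{c} = 10^{5}\,\text{years}.
\]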
My guess is that in a cosmos teeming with superintelligence, almost the only commodity worth shipping long distances will be information.
Subrahmanyan Chandrasekhar
Chandrasekhar limit,
fecund
immutable,
An ambitious civilization can thus encounter three kinds of regions: uninhabited ones, life bubbles and death bubbles.
our Universe, whose radius is about 10²⁶ meters,
the Tsar Bomba, the most powerful hydrogen bomb ever built.
eschewing
squander
sphalerons or black holes),
The collision of two expanding civilizations may result in assimilation, cooperation or war,
slowpoke civilizations get severely penalized, with one that expands 10 times slower ultimately settling 1,000 times fewer galaxies.
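The thousandfold penalty follows from simple volume scaling: after time $t$, a civilization expanding at speed $v$ has settled a roughly spherical region, so the number of galaxies reached grows as
\[
N \propto (v t)^{3}, \qquad \left(\frac{v}{10}\right)^{3} = \frac{v^{3}}{1000}.
\]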
John Gribbin
2011 book Alone in the Universe.
Paul Davies’ 2011 book The Ee...
The mystery of human existence lies not in just staying alive, but in finding something to live for. Fyodor Dostoyevsky, The Brothers Karamazov
How did such goal-oriented behavior emerge from the physics of our early Universe,
Fermat’s principle, articulated in 1662,
out of all ways that nature could choose to do something, it prefers the optimal way,
global messiness (entropy).
dissipation-driven adaptation,
“dissipation” means causing entropy to increase,
1944 book What Is Life? by Erwin Schrödinger,
vortices in turbulent fluids can make copies of themselves,
clusters of microspheres can coax nearby spheres into forming identical clusters.
At some point, a particular arrangement of particles got so good at copying itself that it could do so almost indefinitely by extracting energy and raw materials from its enviro...
if you start with one and double just three hundred times, you get a quantity exceeding the number of particles in our Universe.
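The arithmetic behind this, using the common estimate of roughly $10^{80}$ particles in our observable Universe:
\[
2^{300} = \left(2^{10}\right)^{30} \approx \left(10^{3}\right)^{30} = 10^{90} \gg 10^{80}.
\]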
the goal change from dissipation to replication
mastication.
replication aids dissipation,
denizens
Herbert Simon
“bounded rationality”
because they have limited resources: the rationality of their decisions is limited by th...
Evolution has implemented replication optimization in precisely this way: rather than ask in every situation which action will maximize an organism’s number of successful offspring, it implements a hodgepodge of heuristic hacks: rules of thumb that usually work well.
a living organism is an agent of bounded rationality that doesn’t pursue a single goal, but instead follows rules of thumb for what to pursue and avoid. Our human minds perceive these evolved rules of thumb as feelings, which usually (and often without us being aware of it) guide our decision making toward the ultimate goal of replication.
William James
António Damásio.
Why do we sometimes choose to rebel against our genes and their replication goal? We rebel because by design, as agents of bounded rationality, we’re loyal only to our feelings.
different people take it to mean different things,
So far, most of what we build exhibits only goal-oriented design, not goal-oriented behavior: a highway doesn’t behave; it merely sits there.
Teleology is the explanation of things in terms of their purposes rather than their causes,
Goal-Oriented Entities      Billions of Tons
5 × 10³⁰ bacteria           400
Plants                      400
10¹⁵ mesopelagic fish       10
1.3 × 10⁹ cows              0.5
7 × 10⁹ humans              0.4
10¹⁴ ants                   0.3
1.7 × 10⁶ whales            0.0005
Concrete                    100
Steel                       20
Asphalt                     15
1.2 × 10⁹ cars              2
Table 7.1: Approximate amounts of matter on Earth in entities that are evolved or designed for a goal. Engineered entities such as buildings, roads and cars appear on track to overtake evolved entities such as plants and animals.
All machines are agents with bounded rationality, and
the real risk with AGI isn’t malice but competence.
A superintelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours, we’re in trouble.
“friendly AI”: AI whose goals are aligned with ours.3

