Kindle Notes & Highlights
by Max Tegmark
Read between December 4, 2024 - August 18, 2025
Tommaso Toffoli
comput...
Stephen Wolfram
computation is substrate-independent in the same way that information is:
substrate independence doesn’t mean that a substrate is unnecessary, but that most of its details don’t matter.
the substrate-independent phenomenon takes on a life of its own, independent of its substrate.
it’s often only the substrate-independent aspect that we’re interested in:
In short, computation is a pattern in the spacetime arrangement of particles, and it’s not the particles but the pattern that really matters! Matter doesn’t matter.
In other words, the hardware is the matter and the software is the pattern. This substrate independence of computation implies that AI is possible: intelligence doesn’t require flesh, blood or carbon atoms.
Ray Kurzweil
“the law of accelerating returns.”
floating-point operations per second (FLOPS)
speck
Moore’s law involves not the first but the fifth technological paradigm to bring exponential growth in computing, as illustrated in figure 2.8: whenever one technology stopped improving, we replaced it with an even better one. When we could no longer keep shrinking our vacuum tubes, we replaced them with transistors and then integrated circuits, where electrons move around in two dimensions. When this technology reaches its limits, there are many other alternatives we can try—for
program counter,
The ultimate parallel computer is a quantum computer.
David Deutsch
The ability to learn is arguably the most fascinating aspect of general intelligence.
When we humans first created pocket calculators and chess programs, we did the arranging. For matter to learn, it must instead rearrange itself to get better and better at computing the desired function—simply
Neural networks have now transformed both biological and artificial intelligence,
machine learning (the study of algorithms that improve through experience).
it’s the strengths of these roughly hundred trillion synapse connections that encode most of the information in your brain.
activation function
feedforward,
neural networks are universal in the sense that they can compute any function arbitrarily accurately, by simply adjusting those synapse strength numbers accordingly.
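The universality claim can be made concrete with a toy case: by hand-picking the "synapse strength" numbers, a tiny two-layer network computes XOR, a function no single neuron can. A minimal numpy sketch (the weights here are chosen for illustration, not learned):

```python
import numpy as np

def step(x):
    # Hard-threshold activation function: fires (1) when input exceeds 0.
    return (x > 0).astype(float)

# Hand-picked weights and biases for a 2-input, 2-hidden-unit network.
# Hidden unit 0 fires for "x OR y"; hidden unit 1 fires for "x AND y".
W1 = np.array([[1.0, 1.0],
               [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])
# Output fires for "OR but not AND", i.e. XOR.
W2 = np.array([1.0, -2.0])
b2 = -0.5

def network(x):
    hidden = step(x @ W1 + b1)
    return step(hidden @ W2 + b2)

inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
print(network(inputs))  # → [0. 1. 1. 0.]
```

Adjusting those same weight numbers would make the identical wiring compute a different function, which is the sense in which the architecture is universal.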
Henry Lin.
Donald Hebb
nearby neurons
“Fire together, wire together.”
(Hebbian learning)
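The Hebbian rule can be written as a one-line weight update: each synapse strength grows in proportion to the product of pre- and post-synaptic activity. A minimal numpy sketch (the learning rate and activity patterns are illustrative):

```python
import numpy as np

def hebbian_update(W, pre, post, lr=0.1):
    # "Fire together, wire together": strengthen the synapse between
    # every pre- and post-synaptic neuron pair that is co-active.
    return W + lr * np.outer(post, pre)

# Two pre-synaptic and two post-synaptic neurons; all synapses start at zero.
W = np.zeros((2, 2))
pre = np.array([1.0, 0.0])   # only pre-neuron 0 fires
post = np.array([1.0, 0.0])  # only post-neuron 0 fires

for _ in range(5):
    W = hebbian_update(W, pre, post)

print(W)  # only the synapse between the co-active pair has grown
```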
information can flow in multiple directions rather than just one way, so that the current output can become input to what happens next.
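This feedback idea can be sketched with a single recurrent neuron whose own output is fed back as an extra input, giving it a running memory of its input history (the weights are invented for illustration):

```python
def recurrent_neuron(inputs, w_in=1.0, w_rec=0.5):
    # The neuron's previous output loops back as an input, so each
    # output depends on the whole history, not just the current value.
    state = 0.0
    outputs = []
    for x in inputs:
        state = w_in * x + w_rec * state
        outputs.append(state)
    return outputs

print(recurrent_neuron([1, 0, 0]))  # → [1.0, 0.5, 0.25]
```

A feedforward network would output zero for the last two inputs; here the initial input echoes forward, decaying, because the output has become input to what happens next.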
society can itself be viewed as a system that remembers, computes and learns, all at an accelerating pace as one invention enables the next:
Ilya Sutskever
Jeff Hawkins
Google DeepMind
AlphaGo,
stock picking,
Intelligence, defined as ability to accomplish complex goals, can’t be measured by a single IQ, only by an ability spectrum across all goals.
Any matter can be computronium, the substrate for computation,
Once technology gets twice as powerful, it can often be used to design and build technology that’s twice as powerful in turn, triggering repeated capability doubling in the spirit of Moore’s law.
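The arithmetic behind repeated doubling compounds quickly: n doublings multiply capability by 2^n, so about ten doublings already give a thousandfold gain. A small Python sketch:

```python
import math

def doublings_needed(factor):
    # Smallest number of capability doublings yielding at least this gain.
    return math.ceil(math.log2(factor))

print(doublings_needed(1000))  # → 10 (since 2**10 = 1024 >= 1000)
print(2 ** 20)                 # → 1048576: twenty doublings, a million-fold
```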
If we don’t change direction soon, we’ll end up where we’re going. Irwin Corey
flunked,
harbingers
deep reinforcement learning.
if you’re a robot, life itself can be viewed as a game.
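Deep reinforcement learning pairs this trial-and-error loop with a neural network; the loop itself can be sketched with tabular Q-learning on a toy "walk right to the goal" world (the environment, rewards, and hyperparameters here are invented for illustration):

```python
import numpy as np

# Toy environment: states 0..3 in a line; action 0 = left, 1 = right.
# Reaching state 3 yields reward 1 and ends the episode.
N_STATES, N_ACTIONS, GOAL = 4, 2, 3

def env_step(state, action):
    nxt = min(max(state + (1 if action == 1 else -1), 0), GOAL)
    reward = 1.0 if nxt == GOAL else 0.0
    return nxt, reward, nxt == GOAL

Q = np.zeros((N_STATES, N_ACTIONS))
alpha, gamma, epsilon = 0.5, 0.9, 0.2
rng = np.random.default_rng(0)

for _ in range(500):  # episodes of trial and error
    s, done = 0, False
    while not done:
        # Epsilon-greedy: mostly exploit current knowledge, sometimes explore.
        a = int(rng.integers(N_ACTIONS)) if rng.random() < epsilon else int(Q[s].argmax())
        s2, r, done = env_step(s, a)
        # Q-learning update: nudge the estimate toward reward + discounted future value.
        Q[s, a] += alpha * (r + gamma * Q[s2].max() * (not done) - Q[s, a])
        s = s2

print(Q.argmax(axis=1)[:GOAL])  # learned policy: move right in every non-goal state
```

A deep-RL system such as AlphaGo replaces the Q table with a neural network so the same scheme scales to state spaces far too large to enumerate.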
Robots
To speed things up and reduce the risk of getting stuck or damaging themselves during the learning process, they would probably do the first stages of their learning in virtual reality.
GOFAI—which
“Good Old-Fashioned AI”
Since the Turing test is fundamentally about deception, it has been criticized for testing human gullibility more than true artificial intelligence.

