Kindle Notes & Highlights
Read between May 25 - June 1, 2021
what we call “infinite” often is nothing more than something that we have not yet counted, or understood.
There is another infinity that disorients our thinking: the infinite spatial extension of the cosmos.
Current measurements indicate that the size of the cosmos must be larger than a hundred billion light-years.
It is around 10⁶⁰ times greater than the Planck length: a number written as a 1 followed by sixty zeroes.
The cosmological scale is reflected in the value of the cosmological constant Λ, which enters into the basic equations of our theories. The fundamental theory contains, therefore, a very large number: the ratio between the cosmological constant and the Planck length.
The central point is rebellion against the renunciation of the desire to know. A declaration of faith in the comprehensibility of the world, a proud retaliation to those who remain satisfied with their own ignorance, who call “infinite” that which we don’t understand and delegate knowledge elsewhere.
Many scientists suspect today that the concept of “information” may turn out to be a key for new advances in physics. “Information” is mentioned in the foundations of thermodynamics (the science of heat), in the foundations of quantum mechanics, and in other areas besides, with the word quite often used very imprecisely.
Before anything else, what is information? The word “information” is used in common parlance to mean a variety of different things, and this imprecision is a source of confusion in science as well.
The scientific notion of information, however, was defined with clarity in 1948, by the American mathematician and engineer Claude Shannon, and is something very simple: information is the measure of the number of possible alternatives for something.
Instead of the number of alternatives N, scientists measure information in terms of a quantity called S, for “Shannon information.” S is defined as the logarithm in base 2 of N: S = log2 N. The advantage of using the logarithm is that the unit of measurement S = 1 corresponds to N = 2 (because 1 = log2 2), making the unit of information the minimum number of alternatives: the choice between two possibilities. This unit of measurement is called “bit.”
Two bits of information correspond to four alternatives (red even, red uneven, black even, black uneven). Three bits of information correspond to eight alternatives.
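A minimal sketch in Python of Shannon’s definition as stated above; the function name shannon_bits and the printed examples are mine, chosen to match the roulette-style alternatives in the passage.

```python
import math

def shannon_bits(n_alternatives: int) -> float:
    """Shannon information S = log2(N) for N equally likely alternatives."""
    return math.log2(n_alternatives)

# The examples from the passage:
print(shannon_bits(2))  # 1 bit: one binary choice (e.g. red vs. black)
print(shannon_bits(4))  # 2 bits: four alternatives (red even, red uneven, black even, black uneven)
print(shannon_bits(8))  # 3 bits: eight alternatives
```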
Why is the notion of information useful, perhaps even fundamental, to understanding the world? For a subtle reason: because it measures the ability of one physical system to communicate with another physical system.
the way in which the atoms arrange themselves is correlated with the way other atoms arrange themselves. Therefore a set of atoms can have information, in the precise technical sense described previously, about another set of atoms. This, in the physical world, happens continuously and throughout, in every moment and in every place: the light that arrives at our eyes carries information about the objects it has played across; the color of the sea has information on the color of the sky above it; a cell has information about the virus that is attacking it; a new living being has plenty of…
The world isn’t, then, just a network of colliding atoms: it is also a network of correlations between sets of atoms, a network of real, reciprocal information between physical systems.
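One standard way to make this “network of reciprocal information” quantitative is Shannon’s mutual information between two correlated variables; the toy joint distribution below (sea color versus sky color) is invented purely for illustration and is not from the book.

```python
import math

# Joint probabilities for two correlated "systems": the sky and the sea.
# These numbers are made up purely to illustrate correlation.
joint = {
    ("sky_blue", "sea_blue"): 0.45,
    ("sky_blue", "sea_grey"): 0.05,
    ("sky_grey", "sea_blue"): 0.05,
    ("sky_grey", "sea_grey"): 0.45,
}

def marginal(joint, index):
    """Sum the joint distribution over the other variable."""
    m = {}
    for pair, p in joint.items():
        m[pair[index]] = m.get(pair[index], 0.0) + p
    return m

p_sky = marginal(joint, 0)
p_sea = marginal(joint, 1)

# Mutual information I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) )
mutual_info = sum(
    p * math.log2(p / (p_sky[x] * p_sea[y]))
    for (x, y), p in joint.items() if p > 0
)
print(f"{mutual_info:.3f} bits")  # > 0: the sea's color carries information about the sky
```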
Heat is the random, microscopic movement of molecules: when the tea is hotter, the movement of the molecules is more agitated. Why does it cool down? Boltzmann hazarded a splendid hypothesis: because the number of possible states of the molecules in hot tea and cold air is smaller than the number in cool tea and slightly warmer air. The combined state evolves from a situation where there are fewer possible states, to a situation where there are more possible states. The tea can’t warm itself up, because information cannot increase by itself.
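Boltzmann’s counting argument can be illustrated with a toy model: two small systems (the “tea” and the “air”) sharing a fixed number of energy quanta, with microstates counted combinatorially. The model (an Einstein-solid-style counting) and all the numbers below are my illustration, not the book’s.

```python
from math import comb

def multiplicity(oscillators: int, quanta: int) -> int:
    """Number of ways to distribute `quanta` energy units among `oscillators`."""
    return comb(quanta + oscillators - 1, quanta)

TEA, AIR = 30, 30          # two small systems of equal size
# Total energy (40 quanta) is conserved in both scenarios below.

# "Hot tea, cold air": almost all the energy sits in the tea.
unequal = multiplicity(TEA, 35) * multiplicity(AIR, 5)
# "Cool tea, slightly warmer air": the energy is spread evenly.
equal = multiplicity(TEA, 20) * multiplicity(AIR, 20)

print(equal > unequal)     # True: the evenly shared state has far more microstates
print(equal / unequal)     # roughly how many times more
```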
S = k log W, which expresses (missing) information as the logarithm of the number of alternatives, Shannon’s key idea.
Entropy is “missing information,” that is, information with a minus sign. The total amount of entropy can only increase, because information can only diminish.
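Set side by side, the two formulas the passage connects look like this; writing Boltzmann’s entropy in bits by dividing by k ln 2 is my gloss, not a line from the book.

```latex
S_{\text{Boltzmann}} = k \log W,
\qquad
S_{\text{Shannon}} = \log_2 N,
\qquad
\frac{S_{\text{Boltzmann}}}{k \ln 2} = \log_2 W \quad \text{(missing information, in bits)}
```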
Remember that a key result of quantum mechanics is precisely the fact that information is finite.
The entire formal structure of quantum mechanics can be in large measure expressed in two simple postulates: (1) the relevant information in any physical system is finite; (2) you can always obtain new information on a physical system.
The first postulate characterizes the granularity of quantum mechanics: the fact that a finite number of possibilities exists.
The second characterizes its indeterminacy: the fact that there is always something unpredictable that allows us to obtain new information.
When we acquire new information about a system, the total relevant information cannot grow indefinitely (because of the first postulate), and part of the previous information becomes irrelevant, that is to say,...
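A sketch of how the two postulates play out for the smallest quantum system, a single qubit. The choice of a qubit and the small numpy simulation below are my illustration, not something stated in the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# A single qubit carries at most one bit of relevant information (first postulate):
# any measurement on it has only two possible outcomes.
state = np.array([1.0, 0.0])          # prepared as |0>: the Z outcome is certain

def measure(state, basis):
    """Measure `state` in a basis (rows of a 2x2 matrix); return (outcome, new state)."""
    probs = np.abs(basis @ state) ** 2
    outcome = rng.choice(2, p=probs)
    return outcome, basis[outcome]

Z = np.eye(2)                                        # computational basis
X = np.array([[1, 1], [1, -1]]) / np.sqrt(2)         # diagonal basis

# Measuring this state in X yields a genuinely unpredictable result (second postulate)...
outcome_x, state = measure(state, X)
# ...but afterwards the old Z information is no longer relevant:
outcome_z, state = measure(state, Z)                 # now 50/50, the earlier certainty is gone
print(outcome_x, outcome_z)
```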
When information enters into a black hole, it is no longer recoverable from outside. But the information that enters the black hole carries with it the energy by which the black hole becomes larger and increases its area. Viewed from outside, the information lost in the black hole now appears as entropy associated with the area of the hole.
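The quantitative form of this statement is the standard Bekenstein-Hawking relation between a black hole’s entropy and its horizon area; the formula is textbook physics rather than something quoted in the excerpt.

```latex
S_{\text{BH}} = \frac{k_B c^3 A}{4 G \hbar} = \frac{k_B A}{4 \ell_P^2},
\qquad
\ell_P = \sqrt{\frac{\hbar G}{c^3}}
```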
Up and down are meaningful near a large mass, like a planet. “Down” indicates the direction toward which the large near mass exerts gravitational pull; “up” indicates the opposite direction. The same goes for “hot” and “cold”: there are no “hot” or “cold” things at the microscopic level, but when we put together a large number of microscopic constituents and describe them in terms of averages, then the notion of “heat” appears: a hot body is a body where the average speed of single constituents is raised.
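For a monatomic ideal gas, a standard kinetic-theory relation (my addition, not the book’s) makes “hotter means faster on average” precise: temperature is proportional to the mean kinetic energy of the constituents.

```latex
\frac{3}{2} k_B T = \frac{1}{2} m \langle v^2 \rangle
```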
Something similar must apply to “time.” If the notion of time has no role to play at an elementary level, it certainly plays a significant role in our lives, just as “up” and “hot” do. What does “the passage of time” mean, if time plays no part in the fundamental description of the world?
The answer is simple. The origin of time may be similar to that of heat: it comes from averages of many microscopic variables.
The salient characteristic of time is that it moves forward and not backward, that is to say, there are irreversible phenomena.
When the stone reaches the ground, it stops, you might object: if you watch the film reversed, you see a stone leaping up from the ground by itself, and this is implausible. But when the stone reaches the ground and stops, where does its energy go? It heats the ground! At the precise moment when heat is produced, the process is irreversible: the past differs from the future. It is always heat and only heat that distinguishes the past from the future.
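Rough numbers, mine and purely illustrative, for the falling stone: the mechanical energy involved is modest and the resulting warming of the ground is tiny, but that conversion into heat is exactly the irreversible step.

```python
# A 1 kg stone falling 2 m (illustrative numbers, not from the text).
m, g, h = 1.0, 9.8, 2.0
energy_joules = m * g * h                 # kinetic energy at impact, all turned into heat
ground_mass, c_rock = 10.0, 800.0         # assume 10 kg of rock warmed, specific heat ~800 J/(kg K)
delta_T = energy_joules / (ground_mass * c_rock)
print(energy_joules, delta_T)             # ~19.6 J, ~0.0025 K: small, but irreversible
```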
A burning candle is transformed into smoke, the smoke cannot transform into a candle—and a candle produces heat.
Whenever you consider a phenomenon certifying the passage of time, it is through the production of heat that it does so. There is no preferred direction of time without heat.
The idea of thermal time reverses this observation. That is to say, instead of inquiring how time produces dissipation in heat, it asks how heat produces time.
Hence time is not a fundamental constituent of the world, but it appears because the world is immense, and we are small systems within the world, interacting only with macroscopic variables that average among innumerable small, microscopic variables.
I believe that in order to understand reality, we have to keep in mind that reality is this network of relations, of reciprocal information, that weaves the world. We slice up the reality surrounding us into “objects.” But reality is not made up of discrete objects. It is a variable flux.
But if we are certain of nothing, how can we possibly rely on what science tells us? The answer is simple. Science is not reliable because it provides certainty. It is reliable because it provides us with the best answers we have at present. Science is the most we know so far about the problems confronting us. It is precisely its openness, its constant putting of current knowledge in question, that guarantees that the answers it offers are the best so far available: if you find better answers, these new answers become science.
The answers given by science, then, are not reliable because they are definitive. They are reliable because they are not definitive.
For this reason, science and religion frequently find themselves on a collision course. Not because science pretends to know ultimate answers, but precisely for the opposite reason: because the scientific spirit distrusts whoever claims to possess ultimate answers or privileged access to Truth.
Quanta of space mingle with the foam of spacetime, and the structure of things is born from reciprocal information that weaves the correlations among the regions of the world. A world that we know how to describe with a set of equations. Perhaps to be corrected.

