Kindle Notes & Highlights
Entropy, by Boltzmann’s reasoning, is in essence a measure of the number of indistinguishable ways the constituent parts of a system can be arranged. To say entropy increases in any given system is another way of saying that the system evolves into ever-more-likely distributions or configurations. The second law of thermodynamics is true for the same reason that when a pack of cards arranged in suits is shuffled, it will end up jumbled. There are many more indistinguishable ways for the pack to be disordered than there are for it to end up ordered, and so shuffling takes it in that direction.
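A rough numerical aside, not from the book: counting the deck’s arrangements shows just how lopsided the odds are. The sketch below assumes “arranged in suits” means each of the four suits appears as one sorted block, with the suits themselves in any order.

```python
# Toy illustration of why shuffling takes a deck toward disorder (assumptions above).
import math

total_orderings = math.factorial(52)      # every possible outcome of a shuffle
ordered_arrangements = math.factorial(4)  # 4! = 24 suit-block arrangements (assumed definition)

probability = ordered_arrangements / total_orderings
print(f"possible orderings of the deck    : {total_orderings:.3e}")
print(f"arrangements counted as 'ordered' : {ordered_arrangements}")
print(f"chance a random shuffle is ordered: {probability:.3e}")
```

Virtually every shuffle therefore lands on one of the astronomically many jumbled arrangements, which is the whole content of the second law in this picture.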
Entropy increases with time, therefore, because the chances of its decreasing are tiny. In fact, and this is the mind-blowing part of Boltzmann’s logic, only by observing entropy increase can we tell the direction of time. We differentiate the future from the past because in the future the overall entropy is greater. So, by trying to understand heat in terms of atoms, Boltzmann had uncovered what lay behind William Thomson’s discovery of time’s arrow. Imagine you’re watching a film that shows the heat in a kitchen flowing back into an oven, or a film showing the milk in a teacup separating
...more
Boltzmann had explained this as a natural consequence of countless collisions between air molecules that individually follow the same rules of physics that billiard balls do when they collide. Therein, claimed Loschmidt, lay the paradox. The laws that describe each individual molecular collision are reversible. They are completely symmetrical in time. To see why, imagine watching a film that shows a close-up collision between two billiard balls. One ball enters from the left and strikes a stationary one. The first ball stops, and the second ball exits to the right. Now imagine what you’d see
...more
S = k ln W

Now considered one of the foundational statements of physics, this formulation is inscribed on Boltzmann’s grave in Vienna. It’s a mathematical statement meaning that the entropy (S) of a system is proportional to the logarithm of the number of indistinguishable arrangements (W) it can take, with Boltzmann’s constant (k) setting the scale.
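A minimal sketch of what the formula computes, using the shuffled deck from the earlier highlight as a toy “system” (my example, not the book’s):

```python
# Boltzmann entropy S = k ln W for a toy system with W = 52! arrangements.
import math

k_B = 1.380649e-23      # Boltzmann's constant, joules per kelvin
W = math.factorial(52)  # number of indistinguishable arrangements
S = k_B * math.log(W)   # entropy, joules per kelvin

print(f"W = {W:.3e} arrangements")
print(f"S = k ln W = {S:.3e} J/K")  # roughly 2.2e-21 J/K
```

The logarithm is what makes entropy additive: combine two independent systems, so that their arrangement counts multiply, and their entropies simply add.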
Meanwhile, Josiah Willard Gibbs had also been busy in his Yale study. He’d realized that the laws of thermodynamics could bring a deep, new understanding to the field of chemistry. Above all else, Gibbs would give future generations of scientists a framework for understanding the chemical processes that occur within living organisms.
Gibbs’s insight was to find a way of showing how the two laws of thermodynamics drive all chemical reactions. He chose to start his argument with a restatement of those laws, so let’s follow his lead:

First law: The energy in the universe is constant.
Second law: The entropy of the universe tends to increase.

Gibbs then showed how all processes of change can be judged by these two laws. He did this, essentially, by turning the two laws into one new law we can call Gibbs’s law: The flow of energy is the means by which the entropy of the universe is increased.
Gibbs’s equation allows us to tot up all the entropy changes in different parts of the universe to reveal a marketplace—one in which one bit of the universe pays other bits of the universe for a highly desirable commodity—a local and temporary reduction in entropy. And it does so with a specific and well-defined currency—energy.
When hydrogen burns in oxygen, a great deal of heat is dispersed, far more than is needed to compensate for the fall in entropy caused by creating the steam. This excess or “free” energy can be used to produce mechanical work, such as driving a car engine. But it can also be used to force other chemical reactions to proceed in the reverse or “nonspontaneous” direction, just as the heat flow in one house reversed it in the other. In this context, the available energy is often referred to as Gibbs free energy, and it’s the means by which chemical reactions are coupled.
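As a back-of-the-envelope sketch of the bookkeeping, using approximate textbook values I’ve supplied (they are not quoted in the book) for burning hydrogen to steam near room temperature:

```python
# Gibbs free energy for 2 H2 + O2 -> 2 H2O (steam), approximate values.
T = 298.15        # temperature, kelvin
dH = -483.6e3     # heat released (enthalpy change), J per 2 mol of steam (approx.)
dS = -89.0        # entropy change of the chemicals themselves, J/K (a fall, approx.)

dG = dH - T * dS  # Gibbs free energy change
print(f"delta G = {dG / 1000:.0f} kJ")  # about -457 kJ free to do work
# The dispersed heat (|dH|) far exceeds what is needed to offset the entropy
# drop (T*|dS| is only about 27 kJ), so the universe's entropy still rises and
# the surplus can drive an engine or a coupled "nonspontaneous" reaction.
```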
Shannon wanted a way of measuring information that could be universally agreed upon and independent of the encoding method. To do that, he argued that the size of a piece of information must be the smallest number of bits needed to encode the message. With the message “help” (four letters at five bits each, or twenty bits in a fixed-length code), is it possible to use fewer than twenty bits to send it? The answer is yes, if you account for the fact that some letters appear far more frequently than others in any written English text.
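A hedged illustration, using commonly quoted rough English letter frequencies that I’ve supplied (the book’s own figures may differ), of why “help” needs fewer than the twenty bits of a fixed five-bits-per-letter code:

```python
# Shannon's measure charges each letter -log2(p) bits, where p is its frequency.
import math

approx_freq = {"h": 0.061, "e": 0.127, "l": 0.040, "p": 0.019}  # rough values

bits = sum(-math.log2(approx_freq[ch]) for ch in "help")
print(f"information in 'help': about {bits:.1f} bits (versus 20 at 5 bits per letter)")
```

Common letters such as ‘e’ cost few bits and rare ones like ‘p’ cost more; a frequency-aware code (Huffman coding, for instance) exploits exactly this imbalance.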
The scientist Rolf Landauer captured this idea with the phrase “information is physical.” All forms of information require a change in the physical universe. Written words require marks to be made on a physical medium of some kind. But even a spoken word requires the movement of vocal cords, which make air molecules vibrate. Similarly, a thought requires electrochemical changes in the neurons in our brains. In this sense, information entropy is constrained by thermodynamic entropy. As physical systems decay, so does any information they carry. Imagine writing your name in the sand on a beach.
...more
The Landauer limit, though, is a tiny amount. Real transistors dissipate 10 billion times as much heat. But knowing this ideal minimum heat dissipated when a bit is erased is invaluable because it tells us that the laws of physics permit us to do considerably better than our current silicon-based technology. We may never build a useful computer that erases bits at a rate of heat dissipation as low as the Landauer limit, but knowing it tells us that we can, in principle, reduce the heat coming off our chips by factors of thousands if not millions. The other reason to believe it is possible to
...more
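The excerpt doesn’t quote the formula, but the standard expression for the Landauer limit, the minimum heat released when one bit is erased, is E = k_B T ln 2. A minimal sketch of the numbers:

```python
# Landauer limit at roughly room temperature, plus the "10 billion times" figure
# from the highlight above.
import math

k_B = 1.380649e-23             # Boltzmann's constant, J/K
T = 300.0                      # roughly room temperature, kelvin
e_min = k_B * T * math.log(2)  # joules per erased bit

print(f"Landauer limit at ~300 K    : {e_min:.2e} J per bit")  # ~2.9e-21 J
print(f"x 10 billion (today's chips): {e_min * 1e10:.2e} J per bit")
```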
With time to think, Turing focused on what he felt was the fascinating confluence of mathematics, computing, and biology. From 1947 to 1948, he wrote groundbreaking papers on the way nerve cells in the brain might work and on how machines might emulate that process. Then in 1948, Max Newman, also a code breaker from Bletchley Park and now professor of mathematics at Manchester University, recruited Turing. Newman had also secured funding to research and build computers and knew, rightly, that Turing’s expertise would be invaluable. The machines that Turing and his colleagues then developed in
...more
The paper in Mind demonstrates Turing’s long-term interest in the following question: If “dumb” electrical circuits in a computer could perform mathematical tasks previously only carried out by human minds, was it possible that similar “dumb” processes ultimately underpinned all the ways those minds worked? Though, of course, the components in a brain’s circuit would be interacting chemicals in nerve cells rather than electrical valves and relays.
As Einstein put it, “For an observer in free fall… there exists, during his fall, no gravitational field.” This observation, that being in free fall is indistinguishable from being in a region of zero gravity, is called the principle of equivalence.
The earth, in other words, appears to measure the distance from its own center to every object nearby, as well as measuring the mass of all these objects. Then it has to calculate the exact amount of force to exert on each of those objects and the direction in which those forces must act. It then instantly transmits that force to all objects falling downward. It’s a preposterous idea. Indeed, the first person to point out quite how preposterous was Newton himself. In a letter shortly after he published his theory of gravity, he said, “That gravity should be innate, inherent, and essential to
...more
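For reference, the “exact amount of force” the passage says the earth would have to calculate is given by Newton’s inverse-square law (not written out in the excerpt): for an object of mass m at a distance r from the center of the earth, whose mass is M, with G the gravitational constant,

```latex
F = \frac{G M m}{r^{2}}
```

and it is the instantaneous, unmediated transmission of this force across empty space that Newton himself found so troubling.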
The earth’s mass curves what was flat space toward the point N. That means that Alice, though she thinks she’s moving forward in time and not moving in space, is in fact following the curved line from point A to point N. And though Bob, like Alice, thinks he’s moving forward in time and not moving in space, he, too, is following a curved line, from point B to N. So, Alice and Bob move inexorably closer to each other and to point N not because they’re being pulled by a gravitational force, but because that’s the path they must follow in curved space if they remain still. So the key idea to
...more
To follow Hawking’s exact reasoning is tricky. To get a rough intuition for what he did, we must consider one of the most outlandish consequences of the famous “uncertainty principle” of quantum physics—something known as “vacuum energy.” As the name implies, far from being inert, the vacuum is seething with activity. At any instant, bursts of energy appear from nowhere by borrowing equivalent bursts of energy from a tiny instant in the future. Mostly, we are unaware of these fluctuations because the positive burst of energy that appears at one instant is cancelled out by the negative burst that
...more
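The borrowing described above is usually tied to the energy-time form of the uncertainty principle; the excerpt doesn’t write it out, but the standard heuristic relation is

```latex
\Delta E \, \Delta t \gtrsim \frac{\hbar}{2}
```

meaning, loosely, that a fluctuation of energy ΔE can persist only for a time of order ħ/ΔE before the books must be balanced.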
A rabbit had been pulled out of the physics hat. Between them, Hawking and Bekenstein had shown that the three great ideas of modern physics—general relativity, quantum mechanics, and thermodynamics—work in harmony. For these reasons, black hole entropy and radiation have come to dominate contemporary physics as scientists search for a so-called grand unified theory, the Holy Grail of a single principle that explains nature—the world, the universe, everything—at its most fundamental level.
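The harmony the passage describes is visible in the Bekenstein–Hawking entropy formula itself (standard physics, though not quoted in the excerpt), which gives a black hole’s entropy in terms of the area A of its event horizon:

```latex
S_{\mathrm{BH}} = \frac{k_{B}\, c^{3}\, A}{4\, G\, \hbar}
```

Boltzmann’s constant k_B comes from thermodynamics, G and c from general relativity, and ħ from quantum mechanics, so all three frameworks meet in a single expression.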
Think of one dot as our Milky Way, and imagine that a circular boundary is drawn around it and its nearby galaxies. Inside this boundary, the expansion of the balloon carries dots away from us slower than the speed of light, whereas outside it they recede faster. Only objects inside this boundary are visible to us; everything outside the boundary is invisible. But because the rate at which the balloon is expanding is increasing, more and more of these objects will disappear as they flow across the boundary. This process should sound familiar. It’s as if our universe is “an inside-out black hole.” Instead of stuff flowing
...more
Tyndall discovered something quite extraordinary. Infrared radiation was barely affected by nitrogen and oxygen, the gases that make up 99 percent of the atmosphere. But when the same radiation passed through air containing water vapor or carbon dioxide, even though these gases are present in tiny amounts, the thermometer registered a fall in the temperature reading. The only explanation, argued Tyndall, was that the presence of these gases increases the air’s ability to absorb infrared radiation by a factor of about fifteen. Tyndall had discovered what we now call the greenhouse effect. Water
...more

