Kindle Notes & Highlights
The Second Law is robust; it depends on the definition of entropy as the logarithm of a volume within the space of states, but not on the precise way in which we choose that volume. Nevertheless, in practice we do make certain choices and not others, so this transparent attempt to avoid the issue is not completely satisfying.
Averaging within small regions of space isn’t a procedure that we hit upon randomly, nor is it a peculiarity of our human senses as opposed to the senses of a hypothetical alien; it’s a very natural thing, given how the laws of physics work.
So even though in principle our choice of how to coarse-grain microstates into macrostates seems absolutely arbitrary, in practice Nature hands us a very sensible way to do it.
RUNNING ENTROPY BACKWARD
The arrow of time isn’t a consequence of the fact that “entropy increases to the future”; it’s a consequence of the fact that “entropy is very different in one direction of time than the other.”
The important thing is that entropy increases in the same temporal direction for everyone within the observable universe, so that they can agree on the direction of the arrow of time.
THE DECONSTRUCTION OF BENJAMIN BUTTON
Why is it difficult/impossible to choose a state of the universe with the property that, as we evolve it forward in time, some parts of it have increasing entropy and some parts have decreasing entropy?
ENTROPY AS DISORDER
Order appears spontaneously at the macroscopic level, but it’s ultimately a matter of disorder at the microscopic level.
We’re going to have a lot to say about how gravity wreaks havoc with our everyday notions of entropy, but for now suffice it to say that the interaction of gravity with other forces seems to be able to create order while still making the entropy go up—temporarily, anyway. That is a deep clue to something important about how the universe works; sadly, we aren’t yet sure what that clue is telling us.
THE PRINCIPLE OF INDIFFERENCE
We have this large set of microstates, which we divide up into macrostates, and declare that the entropy is the logarithm of the number of microstates per macrostate. Then we are asked to swallow another considerable bite: The proposition that each microstate within a macrostate is “equally likely.”
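A minimal sketch of that counting, for a toy system in which the microstate is which of N molecules sit in the left half of a box and the macrostate records only how many do; the function name boltzmann_entropy and the numbers are illustrative, not from the book:

```python
from math import comb, log

def boltzmann_entropy(n_total, n_left):
    """Logarithm of the number of microstates (choices of which molecules
    sit on the left) compatible with the macrostate
    'n_left of the n_total molecules are in the left half'."""
    w = comb(n_total, n_left)  # microstates in this macrostate
    return log(w)

N = 100
for n_left in (0, 10, 50):
    print(n_left, round(boltzmann_entropy(N, n_left), 2))
# 0  -> 0.0    (only one way to have an empty left half)
# 10 -> ~30.5  (vastly more ways)
# 50 -> ~66.8  (the evenly mixed macrostate contains by far the most
#               microstates, hence has the highest entropy)
```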
Statistical mechanics, the version of thermodynamics based on atoms, is essentially probabilistic—we don’t know for sure what is going to happen; we can only argue that certain outcomes are overwhelmingly likely.
Conventional statistical mechanics, in other words, makes a crucial assumption: Given that we know we are in a certain macrostate, and that we understand the complete set of microstates corresponding to that macrostate, we can assume that all such microstates are equally likely. We can’t avoid invoking some assumption along these lines; otherwise there’s no way of making the leap from counting states to assigning probabilities. The equal-likelihood assumption has a name that makes it sound like a dating strategy for people who prefer to play hard to get: the “Principle of Indifference.”
And the Principle of Indifference is basically the best we can do. When all we know is that a system is in a certain macrostate, we assume that every microstate within that macrostate is equally likely. (With one profound exception—the Past Hypothesis—to be discussed at the end of this chapter.) It would be nice if we could prove that this assumption should be true, and people have tried to do that.
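Written as a formula (a standard rendering of the assumption, not a quotation from the book): if a macrostate $M$ is compatible with $W(M)$ microstates, the Principle of Indifference assigns each of them the same probability,

$$ P(\text{microstate } i \mid M) = \frac{1}{W(M)} \quad \text{for every } i \text{ in } M. $$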
The real reason we use the Principle of Indifference is that we don’t know any better. And, of course, because it seems to work.
OTHER ENTROPIES, OTHER ARROWS
As a state evolving through time moves from a low-entropy condition to a high-entropy condition, if we choose to forget everything other than the macrostate to which it belongs, we end up knowing less and less about which state we actually have in mind. In other words, if we are told that a system belongs to a certain macrostate, the probability that it is any particular microstate within that macrostate decreases as the entropy increases, just because there are more possible microstates it could be. Our information about the state—how accurately we have pinpointed which microstate it is—goes down as the entropy goes up.
Instead of thinking of entropy as something that characterizes individual states—namely, the number of other states that look macroscopically similar—we could choose to think of entropy as characterizing what we know about the state.
In the Boltzmann way of thinking about entropy, the knowledge of which macrostate we are in tells us less and less about the microstate as entropy increases; the Gibbs approach inverts this perspective and defines entropy in terms of how much we know. Instead of starting with a coarse-graining on the space of states, we start with a probability distribution: the percentage chance, for each possible microstate, that the system is actually in that microstate right now. Then Gibbs gives us a formula, analogous to Boltzmann’s, for calculating the entropy associated with that probability distribution.
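The formula itself, supplied here because the highlight cuts off before stating it, is the standard Gibbs expression

$$ S_{\mathrm{Gibbs}} = -k_B \sum_i p_i \log p_i , $$

where $p_i$ is the probability assigned to microstate $i$ and $k_B$ is Boltzmann's constant. When all $W$ microstates of a macrostate are equally likely, $p_i = 1/W$ and this reduces to Boltzmann's $S = k_B \log W$.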
Neither the Boltzmann formula nor the Gibbs formula for entropy...
To derive something like the Second Law from the Gibbs approach, you have to “forget” something about the evolution. When you get right down to it, that’s philosophically equivalent to the coarse-graining we had to do in the Boltzmann approach; we’ve just moved the “forgetting” step to the equations of motion, rather than the space of states.
And that’s not the end of it; there are several other ways of thinking about entropy, and new ones are frequently being proposed in the literature.
After quantum mechanics came on the scene, John von Neumann proposed a formula for entropy that is specifically adapted to the quantum context.
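For reference, that formula (standard quantum statistical mechanics, not quoted in the highlight) replaces the classical probability distribution with the density matrix $\rho$:

$$ S_{\mathrm{vN}} = -k_B \,\mathrm{Tr}\!\left(\rho \log \rho\right), $$

which reduces to the Gibbs form when $\rho$ is diagonal with the probabilities $p_i$ along its diagonal.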
As we’ll discuss in the next chapter, Claude Shannon suggested a definition of entropy that was very similar in spirit to Gibbs’s, but in the framework of information theory rather than physics.
The point is not to find the one true definition of entropy; it’s to come up with concepts that serve useful functions in the appropriate contexts. Just don’t let anyone bamboozle you by pretending that one definition is the uniquely correct one.
We’ve been dealing with the thermodynamic arrow of time, the one defined by entropy and the Second Law. There is also the cosmological arrow of time (the universe is expanding), the psychological arrow of time (we remember the past and not the future), the radiation arrow of time (electromagnetic waves flow away from moving charges, not toward them), and so on.
PROVING THE SECOND LAW
WHEN THE LAWS OF PHYSICS AREN’T ENOUGH
THE PAST HYPOTHESIS
Why did the universe have a low entropy near the Big Bang?
9. INFORMATION AND LIFE
Of all the ways in which the arrow of time manifests itself, memory—and in particular, the fact that it applies to the past but not the future—is the most obvious, and the most central to our lives.
This implies that our ability to remember the past but not the future must ultimately be explained in terms of entropy, and in particular by recourse to the Past Hypothesis that the early universe was in a very low-entropy state.
PICTURES AND MEMORIES
We can reconstruct the past accurately only by assuming a Past Hypothesis, in addition to knowledge of our current macrostate.
By demanding that our history stretch from a low-entropy beginning to here, we dramatically restrict the space of allowed trajectories, leaving us with those for which our records are (for the most part) reliable reflections of the past.
COGNITIVE INSTABILITY
The truth is, we don’t have any more direct empirical access to the past than we have to the future, unless we allow ourselves to assume a Past Hypothesis.
Indeed, the Past Hypothesis is more than just “allowed”; it’s completely necessary, if we hope to tell a sensible story about the universe.
Without the Past Hypothesis, we simply can’t tell any intelligible story about the world; so we seem to be stuck with it, or stuck with trying to find a theory that actually explains it.
CAUSE AND EFFECT
MAXWELL’S DEMON
RECORDING AND ERASING
The act of erasing information necessarily transfers entropy to the outside world.
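The standard quantitative statement of this fact is Landauer's principle (the bound below is textbook physics, not a quote from this highlight): erasing one bit of information in an environment at temperature $T$ requires dumping at least

$$ Q \ge k_B T \ln 2 $$

of heat, equivalently at least $\Delta S = k_B \ln 2$ of entropy, into that environment.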
INFORMATION IS PHYSICAL
Information is physical. More concretely, possessing information allows us to extract useful work from a system in ways that would have otherwise been impossible.
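One standard way to make this concrete (the Szilard engine, a thought experiment supplied here for illustration) is a single-molecule gas in a box: one bit of information, namely which half of the box the molecule occupies, lets you insert a partition on the right side and extract up to

$$ W_{\max} = k_B T \ln 2 $$

of work from a heat bath at temperature $T$, exactly the minimum erasure cost noted above.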
Shannon was interested in finding efficient and reliable ways of sending signals across noisy channels. He had the idea that some messages carry more effective information than others, simply because the message is more “surprising” or unexpected. If I tell you that the Sun is going to rise in the East tomorrow morning, I’m not actually conveying much information, because you already expected that was going to happen. But if I tell you the peak temperature tomorrow is going to be exactly 25 degrees Celsius, my message contains more information, because without the message you wouldn’t have known what the temperature was going to be.
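A minimal sketch of Shannon's "surprise" measure in code; the probabilities below are made up purely for illustration:

```python
from math import log2

def surprisal_bits(p):
    """Shannon's measure of the information in a message whose prior
    probability of being true is p: unlikely messages carry more bits."""
    return -log2(p)

# "The Sun will rise in the East tomorrow": you already expected this.
print(round(surprisal_bits(0.9999), 4))   # ~0.0001 bits, almost no information
# "The peak temperature tomorrow will be exactly 25 degrees Celsius":
# one outcome out of, say, 40 roughly equally likely temperatures.
print(round(surprisal_bits(1 / 40), 2))   # ~5.32 bits, much more informative
```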

