At the end of the nineteenth century, the founders of statistical mechanics—Maxwell, Boltzmann, and Gibbs—realized that entropy was also a form of information: entropy is a measure of the number of bits of unavailable information registered by the atoms and molecules that make up the world. The second law of thermodynamics comes about, then, by combining this notion with the fact that the laws of physics preserve information, as we will soon discuss. Nature does not destroy bits. But surely it takes an infinite number of bits of entropy to specify the positions and velocities of even a single