Entropy, by Boltzmann’s reasoning, is in essence a count of the ways the constituent parts of a system can be arranged while remaining indistinguishable from one another on the large scale. To say that entropy increases in a system is another way of saying that the system evolves into ever-more-likely distributions or configurations. The second law of thermodynamics is true for the same reason that a pack of cards arranged in suits, once shuffled, ends up jumbled. There are many more indistinguishable ways for the pack to be disordered than for it to be ordered, and so shuffling takes it in that direction.
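
The counting can be made concrete with a minimal sketch. The following Python snippet is illustrative only: it assumes a small 12-card deck (four suits, three ranks) and takes "ordered" to mean all cards of each suit sitting together. It counts the arrangements in each macrostate, then shuffles repeatedly to show that the deck almost always lands in the vastly larger "jumbled" macrostate.

```python
import random
from math import factorial

# A small "deck": 4 suits with 3 ranks each (12 cards), kept small so
# an ordered outcome is rare but still observable in a short simulation.
suits = ["C", "D", "H", "S"]
ranks = ["1", "2", "3"]
deck = [s + r for s in suits for r in ranks]

def grouped_by_suit(cards):
    """True if all cards of each suit sit together (the 'ordered' macrostate)."""
    seen = []
    for card in cards:
        suit = card[0]
        if not seen or seen[-1] != suit:
            if suit in seen:      # suit reappears after a gap -> not grouped
                return False
            seen.append(suit)
    return True

# Count microstates in each macrostate: arrangements grouped by suit
# versus all arrangements of the deck.
ordered_microstates = factorial(len(suits)) * factorial(len(ranks)) ** len(suits)
total_microstates = factorial(len(deck))
print(f"grouped-by-suit arrangements: {ordered_microstates}")   # 31,104
print(f"all arrangements:             {total_microstates}")     # 479,001,600

# Shuffle repeatedly: almost every shuffle ends in the far larger
# "jumbled" macrostate, which is all the second law asserts.
trials, hits = 200_000, 0
for _ in range(trials):
    random.shuffle(deck)
    hits += grouped_by_suit(deck)
print(f"shuffles ending grouped by suit: {hits} of {trials}")
```

With these assumed numbers, only about one shuffle in fifteen thousand comes out grouped by suit; for a full 52-card deck the imbalance is astronomically larger, which is why order never reappears by chance.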

