Entropy was the central concept in Claude Shannon's information theory (which he announced, in nearly complete form, in a single 1948 paper).3 But the roots of entropy go all the way back to Boltzmann and the other thermodynamics pioneers of the late 1800s.