To the physicist, entropy is a measure of uncertainty about the state of a physical system: one state among all the possible states it can be in. These microstates may not be equally likely, so the physicist writes S = −Σ pᵢ log pᵢ. To the information theorist, entropy is a measure of uncertainty about a message: one message among all the possible messages that a communications source can produce. The possible messages may not be equally likely, so Shannon wrote H = −Σ pᵢ log pᵢ.
The Information: A History, a Theory, a Flood
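A quick illustrative sketch (not from the book): the same formula gives Shannon's H for any discrete distribution, here assuming log base 2 so the result is in bits.

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum(p_i * log(p_i)); terms with p_i = 0 contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin has 1 bit of entropy; a biased coin has less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.469
```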