The Information: A History, a Theory, a Flood
Kindle Notes & Highlights
Read between October 14 and October 27, 2017
5%
The written word—the persistent word—was a prerequisite for conscious thought as we understand it.
8%
More sensible ways of ordering words came first and lingered for a long time.
Brian
English was the last language to use logic in sorting. But of course the others were primitive.
14%
Human computers had no future, he saw: “It is only by the mechanical fabrication of tables that such errors can be rendered impossible.”
24%
Words are signs. Sometimes they are said to represent things; sometimes the operations by which the mind combines together the simple notions of things into complex conceptions.
32%
Indeed, H is ubiquitous, conventionally called the entropy of a message, or the Shannon entropy, or, simply, the information.
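The H of this highlight is Shannon's entropy, H = −Σ pᵢ log₂ pᵢ, the average information in bits per symbol. A minimal Python sketch of the empirical version for a text message (the function name is my own, not from the book):

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon's H: average information, in bits per symbol,
    of the empirical symbol distribution in `message`."""
    counts = Counter(message)
    total = len(message)
    # H = -sum over symbols of p * log2(p)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A uniform two-symbol message carries exactly 1 bit per symbol:
print(shannon_entropy("0101"))  # 1.0
# A message with only one symbol carries no information:
print(shannon_entropy("aaaa"))  # 0.0
```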
36%
The knowledge, such as it was, emerged from what Shannon called the vector field, the totality of the twenty-five directional vectors.
Brian
Borel fields
40%
Where Shannon identified information with entropy, Wiener said it was negative entropy. Wiener was saying that information meant order, but an orderly thing does not necessarily embody much information. Shannon himself pointed out their difference and minimized it, calling it a sort of “mathematical pun.”
40%
We disturb the tendency toward equilibrium.
47%
Randomness might be defined in terms of order—its absence, that is.
50%
Everything we care about lies somewhere in the middle, where pattern and randomness interlace.
60%
“Beauty is in the eye of the beholder, and information is in the head of the receiver,” says Fred Dretske.