Kindle Notes & Highlights
But for other languages, including, most famously, Mandarin and Cantonese, tone has primary significance in distinguishing words. So it does in most African languages.
Redundancy—inefficient by definition—serves as the antidote to confusion. It provides second chances. Every natural language has redundancy built in; this is why people can understand text riddled with errors and why they can understand conversation in a noisy room.
Ants deploy their pheromones, trails of chemical information; Theseus unwound Ariadne’s thread. Now people leave paper trails. Writing comes into being to retain information across time and across space.
There is a progression from pictographic, writing the picture; to ideographic, writing the idea; and then logographic, writing the word.
It employs at least fifty thousand symbols, about six thousand commonly used and known to most literate Chinese. In swift diagrammatic strokes they encode multidimensional semantic relationships. One device is simple repetition: tree + tree + tree = forest; more abstractly, sun + moon = brightness and east + east = everywhere. The process of compounding creates surprises: grain + knife = profit; hand + eye = look. Characters can be transformed in meaning by reorienting their elements: child to childbirth and man to corpse.
alphabet (alfabet, alfabeto). The alphabet was invented only once. All known alphabets, used today or found buried on tablets and stone, descend from the same original ancestor, which arose near the eastern littoral of the Mediterranean Sea, sometime not much before 1500 BCE, in a region that became a politically unstable crossroads of culture, covering Palestine, Phoenicia, and Assyria. To the east
Havelock focused on the process of converting, mentally, from a “prose of narrative” to a “prose of ideas”; organizing experience in terms of categories rather than events; embracing the discipline of abstraction.
Logic might be imagined to exist independent of writing—syllogisms can be spoken as well as written—but it did not. Speech is too fleeting to allow for analysis. Logic descended from the written word, in Greece as well as India and China, where it developed independently.
There are no syllogisms in Homer. Experience is arranged in terms of events, not categories.
Logic implicates symbolism directly: things are members of classes; they possess qualities, which are abstracted and generalized. Oral people lacked the categories that become second nature even to illiterate individuals in literate cultures: for example, for geometrical shapes. Shown drawings of circles and squares, they named them as “plate, sieve, bucket, watch, or moon” and “mirror, door, house, apricot drying board.”
telephone, radio, and e-mail. Jonathan Miller rephrases McLuhan’s argument in quasi-technical terms of information: “The larger the number of senses involved, the better the chance of transmitting a reliable copy of the sender’s mental state.”
exactness as could be expected in a time when the address, as a specification of place, did not yet exist: At London, Printed by I. R. for Edmund Weaver, & are to be sold at his shop at the great North doore of Paules Church.
The word cony (rabbit) appeared variously as conny, conye, conie, connie, coni, cuny, cunny, and cunnie in a single 1591 pamphlet. Others spelled it differently.
but Viking invaders brought more words from Norse and Danish: egg, sky, anger, give, get.
James Murray in the nineteenth century established a working method based on index cards, slips of paper 6 inches by 4 inches. At any given moment a thousand such slips sat on Simpson’s desk, and within a stone’s throw were millions more, filling metal files and wooden boxes with the ink of two centuries.
at the margins, the question of what qualifies as a word can become impossible to answer.
A clear line cannot be drawn between word and unword.
Either way, he had trouble getting the point across. He grumbled: On two occasions I have been asked,—“Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?” In one case a member of the Upper, and in the other a member of the Lower, House put this question. I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
Precomputation plus data storage plus data transmission usually came out cheaper than ad hoc computation.
And in those rooms, as Babbage nodded over a book of logarithms, one of them interrupted: “Well, Babbage, what are you dreaming about?” “I am thinking that all these Tables might be calculated by machinery,” he replied.
A formal solution to a game—the very idea of such a thing was original. The desire to create a language of symbols, in which the solution could be encoded—this way of thinking was Babbage’s, as she well knew.
It had been an engine of numbers; now it became an engine of information. A.A.L. perceived that more distinctly and more imaginatively than Babbage himself.
Babbage’s interests, straying so far from mathematics, seeming so miscellaneous, did possess a common thread that neither he nor his contemporaries could perceive. His obsessions belonged to no category—that is, no category yet existing. His true subject was information: messaging, encoding, processing.
No sooner did entrepreneurs begin to organize private telegraphy than France banned it outright: an 1837 law mandated imprisonment and fines for “anyone performing unauthorized transmissions of signals from one place to another, with the aid of telegraphic machines or by any other means.”
When the English and the American enterprises opened their doors to the general public, it was far from clear who, besides the police and the occasional chess player, would line up to pay the tariff. In Washington, where pricing began in 1845 at one-quarter cent per letter, total revenues for the first three months amounted to less than two hundred dollars. The next year, when a Morse line opened between New York and Philadelphia, the traffic grew a little faster. “When you consider that business is extremely dull [and] we have not yet the confidence of the public,” a company official wrote,
Far from annihilating time, synchrony extended its dominion. The very idea of synchrony, and the awareness that the idea was new, made heads spin. The New York Herald declared: Professor Morse’s telegraph is not only an era in the transmission of intelligence, but it has originated in the mind an entirely new class of ideas, a new species of consciousness. Never before was any one conscious that he knew with certainty what events were at that moment passing in a distant city—40, 100, or 500 miles off.
Those who used the telegraph codes slowly discovered an unanticipated side effect of their efficiency and brevity. They were perilously vulnerable to the smallest errors. Because they lacked the natural redundancy of English prose—even the foreshortened prose of telegraphese—these cleverly encoded messages could be disrupted by a mistake in a single character. By a single dot, for that matter. For example, on June 16, 1887, a Philadelphia wool dealer named Frank Primrose telegraphed his agent in Kansas to say that he had bought—abbreviated in their agreed code as BAY—500,000 pounds of wool.
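A minimal Python sketch of that fragility, under assumed codewords (only BAY, for "bought," comes from the passage; the rest of the codebook is hypothetical): a single wrong letter does not produce an obvious garble, it produces a different, perfectly valid instruction.

```python
# Hypothetical commercial codebook; only BAY = "bought" comes from the passage.
CODEBOOK = {
    "BAY": "I have bought",
    "BUY": "You should buy",
    "BAN": "Cancel the order",
}

def corrupt(word: str, pos: int, letter: str) -> str:
    """Simulate a one-character transmission error."""
    return word[:pos] + letter + word[pos + 1:]

sent = "BAY"
received = corrupt(sent, 1, "U")            # one wrong letter: BAY becomes BUY
print(sent, "->", CODEBOOK[sent])           # BAY -> I have bought
print(received, "->", CODEBOOK[received])   # BUY -> You should buy (valid, but wrong)

# Ordinary prose absorbs the same error: "I have bught 500,000 pounds of wool"
# is still readable, because natural language carries redundancy; the code does not.
```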
Secret writing was as old as writing. When writing began, it was in itself secret to all but the few.
“This statement is false” is meta-language: language about language. Russell’s paradoxical set relies on a meta-set: a set of sets. So the problem was a crossing of levels, or, as Russell termed it, a mixing of types. His solution: declare it illegal, taboo, out of bounds. No mixing different levels of abstraction. No self-reference; no self-containment.
One reason for these misguesses was just the usual failure of imagination in the face of a radically new technology. The telegraph lay in plain view, but its lessons did not extrapolate well to this new device.
A stochastic process is neither deterministic (the next event can be calculated with certainty) nor random (the next event is totally free). It is governed by a set of probabilities. Each event has a probability that depends on the state of the system and perhaps also on its previous history. If for event we substitute symbol, then a natural written language like English or Chinese is a stochastic process. So is digitized speech; so is a television signal. Looking more deeply, Shannon
A message, as Shannon saw, can behave like a dynamical system whose future course is conditioned by its past history.
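A small Python sketch of such a process, with made-up transition probabilities rather than anything Shannon measured: the probability of the next letter depends on the current state (vowel or consonant), which is what places the output between the deterministic and the purely random.

```python
import random

# Illustrative two-state transition table; the probabilities are invented.
TRANSITIONS = {
    "vowel":     {"vowel": 0.2, "consonant": 0.8},
    "consonant": {"vowel": 0.7, "consonant": 0.3},
}
LETTERS = {"vowel": "aeiou", "consonant": "bcdfghjklmnpqrstvwxyz"}

def generate(n: int, state: str = "consonant") -> str:
    """Emit n letters whose probabilities depend on the current state."""
    out = []
    for _ in range(n):
        states, weights = zip(*TRANSITIONS[state].items())
        state = random.choices(states, weights=weights)[0]
        out.append(random.choice(LETTERS[state]))
    return "".join(out)

print(generate(40))   # word-like gibberish: patterned, yet not predictable
```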
“The errors, as would be expected, occur most frequently at the beginning of words and syllables where the line of thought has more possibility of branching out.”
“Information can be considered as order wrenched from disorder.”
Shannon told one engineer. "In fact, I find the converse idea, that the human brain may itself be a machine which could be duplicated functionally with inanimate objects, quite attractive."
The orderly states have low probability and low entropy. For impressive degrees of orderliness, the probabilities may be very low. Alan Turing once whimsically proposed a number N, defined as “the odds against a piece of chalk leaping across the room and writing a line of Shakespeare on the board.”
The second law, then, is the tendency of the universe to flow from less likely (orderly) to more likely (disorderly) macrostates.
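Read through Boltzmann's relation (standard statistical mechanics, not quoted in the passage, and assuming equally probable microstates), that tendency is simply a flow toward macrostates realized by more microstates:

```latex
S = k_B \ln W ,
\qquad
\frac{P_{\text{orderly}}}{P_{\text{disorderly}}}
  = \frac{W_{\text{orderly}}}{W_{\text{disorderly}}} \ll 1
```

The disorderly macrostate, with its vastly larger count of microstates W, has both the higher entropy S and the overwhelmingly higher probability.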
Speaking for many molecular biologists, Gunther Stent dismissed Dawkins as “a thirty-six-year-old student of animal behavior” and filed him under “the old prescientific tradition of animism, under which natural objects are endowed with souls.”
Monod proposed an analogy: Just as the biosphere stands above the world of nonliving matter, so an “abstract kingdom” rises above the biosphere.
“All life evolves by the differential survival of replicating entities.
The common element was randomness, Chaitin suddenly thought. Shannon linked randomness, perversely, to information. Physicists had found randomness inside the atom—the kind of randomness that Einstein deplored by complaining about God and dice. All these heroes of science were talking about or around randomness.
But why do we say π is not random? Chaitin proposed a clear answer: a number is not random if it is computable—if a definable computer program will generate it. Thus computability is a measure of randomness.
So we can gauge computability by looking at the size of the algorithm. Given a number—represented as a string of any length—we ask, what is the length of the shortest program that will generate it? Using the language of a Turing machine, that question can have a definite answer, measured in bits.
“At each given moment there is only a fine layer between the ‘trivial’ and the impossible,” Kolmogorov mused in his diary.
The complexity of an object is the size of the smallest computer program needed to generate it. An object that can be produced by a short algorithm has little complexity. On the other hand, an object needing an algorithm every bit as long as the object itself has maximal complexity.
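The shortest-program length itself is not computable in general, but a compressor gives a crude, hedged proxy for the idea in Python (zlib stands in for "smallest program"; it is not Kolmogorov's measure): a patterned string shrinks to almost nothing, while a random string of the same length barely shrinks at all.

```python
import os
import zlib

n = 100_000
patterned = b"01" * (n // 2)      # produced by a tiny rule: repeat "01"
random_bytes = os.urandom(n)      # no rule shorter than the data itself

def proxy_complexity(data: bytes) -> int:
    """Compressed size as a rough stand-in for shortest-program length."""
    return len(zlib.compress(data, 9))

print(proxy_complexity(patterned))     # small: the repetition compresses away
print(proxy_complexity(random_bytes))  # nearly 100,000: effectively incompressible
```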
Kolmogorov often said that no one should do mathematics after the age of sixty. He dreamed of spending his last years as a buoy keeper on the Volga, making a watery circuit in a boat with oars and a small sail. When the time came, buoy keepers had switched to motorboats, and for Kolmogorov, this ruined the dream.
A good scientific theory is economical. This was yet another way of saying so.
Every poem is a message, different for every reader.
There is no logical depth in the parts of a message that are sheer randomness and unpredictability, nor is there logical depth in obvious redundancy—plain repetition and copying. Rather, he proposed, the value of a message lies in “what might be called its buried redundancy—parts predictable only with difficulty, things the receiver could in principle have figured out without being told, but only at considerable cost in money, time, or computation.
Why does nature appear quantized? Because information is quantized. The bit is the ultimate unsplittable particle.