An Introduction to Information Theory Quotes

An Introduction to Information Theory: Symbols, Signals and Noise by John R. Pierce
Showing 1-30 of 58
“The amount of information conveyed by the message increases as the amount of uncertainty as to what message actually will be produced becomes greater. A message which is one out of ten possible messages conveys a smaller amount of information than a message which is one out of a million possible messages. The entropy of communication theory is a measure of this uncertainty and the uncertainty, or entropy, is taken as the measure of the amount of information conveyed by a message from a source. The more we know about what message the source will produce, the less uncertainty, the less the entropy, and the less the information.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
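A quick worked illustration of the point above (my sketch, not the book's): for a source that picks one of N equally likely messages, the entropy is log2(N) bits, so the one-in-a-million message carries far more information than the one-in-ten message.

```python
# Entropy of a source choosing among N equally likely messages: log2(N) bits.
import math

for n in (10, 1_000_000):
    print(f"one message out of {n} possibilities -> {math.log2(n):.2f} bits")
# one message out of 10 possibilities -> 3.32 bits
# one message out of 1000000 possibilities -> 19.93 bits
```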
“The lesson provided by Morse’s code is that it matters profoundly how one translates a message into electrical signals. This matter is at the very heart of communication theory.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Thus, an increase in entropy means a decrease in our ability to change thermal energy, the energy of heat, into mechanical energy. An increase of entropy means a decrease of available energy.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“communication theory grew out of the study of electrical communication,”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Theories become more mathematical or abstract when they deal with an idealized class of phenomena or with only certain aspects of phenomena.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“In fact, network theory might have been developed to explain the behavior of mechanical systems,”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“A branch of electrical theory called network theory deals with the electrical properties of electrical circuits, or networks, made by interconnecting three sorts of idealized electrical structures:”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Statistical mechanics interprets an increase of entropy as a decrease in order or, if we wish, as a decrease in our knowledge.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Further, if there is cybernetics, then someone must practice it, and cyberneticist has been anonymously coined to designate such a person.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Any sequence of decimal digits may occur, but only certain sequences of English letters ever occur, that is, the words of the English language. Thus, it is more efficient to encode English words as sequences of binary digits rather than to encode the letters of the words individually. This again emphasizes the gain to be made by encoding sequences of characters, rather than encoding each character separately.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
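A rough sketch of why word-level coding wins (the numbers are illustrative assumptions, not Pierce's): if a source only ever emits words from a fixed vocabulary, coding each word as a single unit costs about log2(vocabulary size) bits, while spelling the word out letter by letter pays for the many letter sequences that never occur as English words.

```python
# Compare bits per word for whole-word coding vs. letter-by-letter coding,
# assuming equally likely choices at each step (an illustrative simplification).
import math

vocab_size = 10_000    # assumed vocabulary size
avg_word_len = 5       # assumed average word length, in characters
alphabet = 27          # 26 letters plus the space

print(f"whole-word coding:       {math.log2(vocab_size):.1f} bits per word")
print(f"letter-by-letter coding: {avg_word_len * math.log2(alphabet):.1f} bits per word")
```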
“semiquaver skinned the feelings of the manifold.” Certainly the division of words into grammatical categories such as nouns, adjectives, and verbs is not our sole guide concerning the use of words in producing English text. What does influence the choice among words when the words used in constructing grammatical sentences are chosen, not at random by a machine, but rather by a live human being who, through long training, speaks or writes English according to the rules of the grammar? This question is not to be answered by a vague appeal to the word meaning. Our criteria in producing English sentences can be very complicated indeed. Philosophers and psychologists have speculated about and studied the use of words and language for generations, and it is as hard to say anything entirely new about this as it is to say anything entirely true. In particular, what Bishop Berkeley wrote in the eighteenth century concerning the use of language is so sensible that one can scarcely make a reasonable comment without owing him credit. Let us suppose that a poet of the scanning, rhyming school sets out to write a grammatical poem. Much of his choice will be exercised in selecting words which fit into the chosen rhythmic pattern, which rhyme, and which have alliteration and certain consistent or agreeable sound values. This is particularly notable in Poe’s “The Bells,” “Ulalume,” and “The Raven.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“A grammar must specify not only rules for putting different types of words together to make grammatical structures; it must divide the actual words of English into classes on the basis of the places in which they can appear in grammatical structures. Linguists make such a division purely on the basis of grammatical function without invoking any idea of meaning. Thus, all we can expect of a grammar is the generation of grammatical sentences, and this includes the example given earlier: “The chartreuse”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“It is best to illustrate entropy first in a simple case. The mathematical theory of communication treats the message source as an ergodic process, a process which produces a string of symbols that are to a degree unpredictable. We must imagine the message source as selecting a given message by some random, i.e., unpredictable means, which, however, must be ergodic. Perhaps the simplest case we can imagine is that in which there are only two possible symbols, say, X and Y, between which the message source chooses repeatedly, each choice uninfluenced by any previous choices. In this case we can know only that X will be chosen with some probability p0 and Y with some probability p1, as in the outcomes of the toss of a biased coin. The recipient can determine these probabilities by examining a long string of characters (X’s, Y’s) produced by the source. The probabilities p0 and p1 must not change with time if the source is to be ergodic.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
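The two-symbol source described above has a simple closed form for its entropy per symbol; here is a minimal sketch of it (mine, not the book's), with p0 and p1 = 1 - p0 the probabilities of X and Y:

```python
# Entropy in bits per symbol of a biased two-symbol source (independent choices).
import math

def binary_entropy(p0: float) -> float:
    """H = -(p0*log2(p0) + p1*log2(p1)) with p1 = 1 - p0."""
    if p0 in (0.0, 1.0):
        return 0.0                     # a certain outcome carries no information
    p1 = 1.0 - p0
    return -(p0 * math.log2(p0) + p1 * math.log2(p1))

print(binary_entropy(0.5))   # 1.0 bit per symbol: the fair-coin case
print(binary_entropy(0.9))   # ~0.47 bits per symbol: a more predictable source
```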
“This measure of amount of information is called entropy. If”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“The entropy of communication theory is measured in bits. We may say that the entropy of a message source is so many bits per letter, or per word, or per message. If the source produces symbols at a constant rate, we can say that the source has an entropy of so many bits per second.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
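A tiny worked example of the units mentioned above (my numbers, not the book's): a source with an entropy of 0.47 bits per symbol that emits 100 symbols every second has an entropy rate of about 47 bits per second.

```python
# Bits per symbol times symbols per second gives bits per second.
bits_per_symbol = 0.47      # e.g. the biased two-symbol source sketched earlier
symbols_per_second = 100    # assumed constant symbol rate
print(f"{bits_per_symbol * symbols_per_second:.0f} bits per second")
```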
“information in terms of the number of binary digits rather than in terms of the number of different messages that the binary digits can form. This would mean that amount of information should be measured, not by the number of possible messages, but by the logarithm of this number.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“This would mean that amount of information should be measured, not by the number of possible messages, but by the logarithm of this number.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“The octal system is very important to people who use computers.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“It is, in fact, natural to think that man may be a finite-state machine, not only in his function as a message source which produces words, but in all his other behavior as well. We can think if we like of all possible conditions and configurations of the cells of the nervous system as constituting states (states of mind, perhaps). We can think of one state passing to another, sometimes with the production of a letter, word, sound, or a part thereof, and sometimes with the production of some other action or of some part of an action. We can think of sight, hearing, touch, and other senses as supplying inputs which determine or influence what state the machine passes into next. If man is a finite-state machine, the number of states must be fantastic and beyond any detailed mathematical treatment. But, so are the configurations of the molecules in a gas, and yet we can explain much of the significant behavior of a gas in terms of pressure and temperature merely. Can we someday say valid, simple, and important things about the working of the mind in producing written text and other things as well? As we have seen, we can already predict a good deal concerning the statistical nature of what a man will write down on paper, unless he is deliberately trying to behave eccentrically, and, even then, he cannot help conforming to habits of his own.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“conditional probability”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Zero-order approximation”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Gottlob Burmann, a German poet who lived from 1737 to 1805, wrote 130 poems, including a total of 20,000 words, without once using the letter R. Further, during the last seventeen years of his life, Burmann even omitted the letter from his daily conversation. In”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Fourier analysis, which makes it possible to represent any signal as a sum of sine waves of various frequencies.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
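As a small illustration of the Fourier idea in the quote above (my example, not the book's), the partial sums of a square wave's Fourier series show a signal being built up from sine waves of different frequencies:

```python
# Approximate a unit square wave (period 2*pi) by its first few odd sine harmonics.
import math

def square_wave_approx(t: float, n_harmonics: int = 5) -> float:
    """Partial Fourier series: (4/pi) * sum of sin(n*t)/n over odd n."""
    total = 0.0
    for k in range(n_harmonics):
        n = 2 * k + 1                  # only odd harmonics contribute
        total += math.sin(n * t) / n
    return 4.0 / math.pi * total

for t in (0.5, 1.5, 2.5):
    print(f"t = {t}: {square_wave_approx(t):+.3f}")   # values near +1 on (0, pi)
```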
“Wiener is a mathematician whose background ideally fitted him to deal with this sort of problem,”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Finally, Hartley stated, in accord with Nyquist, that the amount of information which can be transmitted is proportional to the band width times the time of transmission. But this makes us wonder about the number of allowable current values, which is also important to speed of transmission. How are we to enumerate them?”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“H, the information of the message, as the logarithm of the number of possible sequences of symbols which might have been selected and showed that H = n log s. Here n is the number of symbols selected, and s is the number of different symbols in the set from which symbols are selected. This is acceptable in the light of our present knowledge of information theory only if successive symbols are chosen independently and if any of the s symbols is equally likely to be selected. In this case, we need merely note, as before, that the logarithm of s, the number of symbols, is the number of independent 0-or-1 choices that can be represented or sent simultaneously, and it is reasonable that the rate of transmission of information should be the rate of sending symbols per second, n, times the number of independent 0-or-1 choices that can be conveyed per symbol. Hartley goes on to the problem of encoding”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
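A quick numeric check of Hartley's H = n log s under the assumptions the quote spells out, taking the logarithm to base 2 so that H comes out in bits:

```python
# H = n * log2(s): n independently, equiprobably chosen symbols from an alphabet of s.
import math

def hartley_information(n: int, s: int) -> float:
    return n * math.log2(s)

print(hartley_information(1, 32))     # 5.0 bits: one of 32 symbols = 5 binary choices
print(hartley_information(100, 32))   # 500.0 bits for a 100-symbol message
```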
“R. V. L. Hartley, the inventor of the Hartley oscillator, was thinking philosophically about the transmission of information at about this time, and he summarized his reflections in a paper, “Transmission of Information,” which he published in 1928.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“this useless component of the signal, which, he said, conveyed no intelligence, as redundant,”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“could have been supplied at the receiver rather than transmitted thence over the circuit.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
