This concise monograph in probability by Mark Kac, a well-known mathematician, presumes a familiarity with Lebesgue's theory of measure and integration, the elementary theory of Fourier integrals, and the rudiments of number theory. Readers may then follow Dr. Kac's attempt "to rescue statistical independence from the fate of abstract oblivion by showing how in its simplest form it arises in various contexts cutting across different mathematical disciplines." The treatment begins with an examination of a formula of Vieta that leads naturally to the notion of statistical independence. Subsequent chapters explore laws of large numbers and Émile Borel's concept of normal numbers; the normal law, as expressed by Abraham de Moivre, together with Andrey Markov's method; and number-theoretic functions and the normal law in number theory. The final chapter ranges in scope from kinetic theory to continued fractions. All five chapters are supplemented by problems.
Kac was a very "readable" mathematician. He had a spirit of play and investigation that looked for deep--not merely formal--connections, especially between mathematics and physics.
This book "does math". It does not have the dry presentation of Theorem, proof, lemma, "assume that" and a huge inventory of formal symbols. It starts with an amusing derivation of a formula which Kac attributes to Vieta. It takes another look at it from a different, more intuitive point of view. It does this repeatedly until one begins to see how the notion of statistical independence arises. All of this is presented not as some dry formal construct following the Hilbert program or some such but as something living and organic growing naturally from intuitive roots. Kac recognizes what he calls a price that must be paid for "unrestrained abstraction". The price is that "it tends also to divert attention from whole areas of application whose very discovery depends on features that the abstract point of view rules out as being accidental."
This book contains a lot of "recreational" mathematics which the reader can work out himself or, as in my case, with the aid of Mathematica. Each chapter has problems that one can work out or at least chew on.
It is difficult to go into more detail without symbols. But Kac also writes enough words around his mathematics to give a feel for what's going on, even for the non-expert. So the first chapter concludes that independence, as used by, say, a physicist, is an informal definition: when two repetitions of an experiment are separated by a large physical distance, he assumes they are independent for the purposes of probability and so applies the rule for multiplication of probabilities. The physicist might be under the impression that this rule follows strictly and logically from the nature of the two experiments. According to Kac he is, in fact, applying a definition that seems to be borne out universally by experience and experiment. This is to be contrasted with an understanding of independence in a "narrow but well-defined sense."
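To make the multiplication rule concrete: for independent events it says P(A and B) = P(A)·P(B). Here is a small sketch (again Python; the success probabilities 1/2 and 1/3 are arbitrary choices of mine) in which two simulated "experiments" that share no information come out, empirically, obeying the product rule.

```python
import random

random.seed(0)
n = 100_000
a_successes = b_successes = joint_successes = 0
for _ in range(n):
    a = random.random() < 0.5    # experiment A "succeeds" with probability 1/2
    b = random.random() < 1 / 3  # experiment B "succeeds" with probability 1/3
    a_successes += a
    b_successes += b
    joint_successes += a and b

print(joint_successes / n)                    # empirical P(A and B)
print((a_successes / n) * (b_successes / n))  # product of the marginals; nearly the same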
This latter, well-defined sense was first laid out by Émile Borel, who observed (1909) that the digits of the binary expansion of a number are "independent"; the bulk of the mathematics up to this point is devoted to showing how this comes about and why it is significant. Interestingly, the same conclusion can be reached concerning the digits of a ternary expansion. As Kac points out, these digits are not fraught with the difficult properties associated with "coins, events, tosses and experiments." I mention this because I myself have had disputes with so-called probability experts concerning the "tossing" of "fair" coins. How are these to be defined? They seem tied to physical phenomena that are ever changing. For example, after a sufficiently large number of tosses the physical properties of a coin will discernibly change. Are we to continue to maintain that it is fair? Similarly for tossing itself: can we guarantee that our method of tossing, even if previously agreed to be consistent with fairness, will continue to be so?
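Borel's observation can be put very concretely: if x is drawn uniformly from [0, 1), its binary digits behave like independent tosses of a fair coin, so any fixed pair of digits takes the values (1, 1) about a quarter of the time. A small sketch (Python; which digits to check is an arbitrary choice of mine):

```python
import random

random.seed(1)

def binary_digits(x, n):
    """The first n binary digits of x in [0, 1)."""
    digits = []
    for _ in range(n):
        x *= 2
        d = int(x)
        digits.append(d)
        x -= d
    return digits

trials = 50_000
hits = 0
for _ in range(trials):
    d = binary_digits(random.random(), 5)
    hits += int(d[0] == 1 and d[4] == 1)  # first and fifth digits both equal 1
print(hits / trials)  # close to 1/4, as independence of the digits predicts
```

No coins, no tossing apparatus, no wear and tear: just digits of a number, which is precisely Kac's point about why Borel's setting is the clean one.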
In this regard, I point the reader to Richard von Mises' analysis of Buffon's needle, in which he argues that different probability distributions will be obtained depending on which random variables are chosen. From this he concludes that it is not at all evident that an actual experiment will produce an unmixed probability distribution consistent with the famous 2/π for the average fraction of crossings. Von Mises points out that "large-scale experiments carried out under specified conditions have led to a frequency value closer to the result based on [this] assumption. This does not imply that for a different experimental arrangement [a different] assumption would not be more adequate." The determination of which assumption is adequate depends on the nature of the tosses and is not a problem of the probability calculus.
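For the curious, the 2/π figure refers to the classical Buffon's needle setup with the needle as long as the spacing between the lines, under the standard assumption that the needle's position and angle are uniform and independent. Here is a minimal Monte Carlo sketch (Python; the function and variable names are mine). Note that the "unmixed" distribution von Mises worries about is exactly what the uniformity assumptions in the simulation build in by fiat; the simulation cannot settle which assumptions a real tossing arrangement satisfies.

```python
import math
import random

random.seed(2)

def buffon_crossing_fraction(drops, needle_len=1.0, spacing=1.0):
    """Fraction of needle drops that cross a line, under the standard model:
    the centre's distance to the nearest line is uniform on [0, spacing/2]
    and the needle's angle is uniform on [0, pi), independently."""
    crossings = 0
    for _ in range(drops):
        y = random.uniform(0, spacing / 2)
        theta = random.uniform(0, math.pi)
        if y <= (needle_len / 2) * math.sin(theta):
            crossings += 1
    return crossings / drops

print(buffon_crossing_fraction(200_000))  # roughly 0.6366
print(2 / math.pi)                        # 0.6366..., the "famous 2/pi"
```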