The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared to a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the metabolic efficiency of neurons, with special reference to visual perception. Evidence from a diverse range of examples is used to show how information theory defines absolute limits on neural efficiency; limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary, tutorial appendices, and a list of annotated Further Readings, this book is an ideal introduction to the principles of neural information theory.
A clear, compact, and up-to-date survey of the efficient coding hypothesis and related principles of neural design. An invaluable guide to the background needed to read current primary literature. I particularly appreciated the discussion of color opponency.
Succinct and fun, it demonstrates how information theory can be used to understand neural design. More specifically, it shows how data on neural design in visual systems match very well with remarkably simple models optimised primarily for metabolic efficiency (bits/J) and secondarily for computational efficiency (bits/s).