This new and updated textbook is an excellent way to introduce probability and information theory to students new to mathematics, computer science, engineering, statistics, economics, or business studies. Requiring only a knowledge of basic calculus, it begins by building a clear and systematic foundation in probability and information. Classic topics covered include discrete and continuous random variables, entropy and mutual information, maximum entropy methods, the central limit theorem, and the coding and transmission of information. New to this edition is material on Markov chains and their entropy. Examples and exercises are included to illustrate how to use the theory in a wide range of applications, with detailed solutions to most exercises available online for instructors.
I decided to read this book to complement a course I took on Information Theory and Statistics. This book does a good job of introducing abstract concepts in Information Theory in a simple and understandable way. I especially liked that every chapter begins with sections that explain new concepts in a way that is easy to intuit.
I really like it because it gives clear intuition for the concepts. A great book for an introduction, although the formal mathematical definitions are not always maintained.