Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as well as how it contrasts with the conventional approach. The theory is built up step by step, and important notions such as sufficiency are brought out of a discussion of the salient features of specific examples. More and more students are realizing that they need to learn Bayesian statistics to meet their academic and professional goals. This book is best suited for use as a main text in courses on Bayesian statistics for third and fourth year undergraduates and postgraduate students.
I have had the second edition on my shelf for more than ten years. From time to time I pick it up and start reading, but every time there is a point where I just have to stop, because I cannot understand anything anymore. That point keeps creeping forwards, so I am not completely hopeless. Unfortunately, the world has moved on, and the second edition is somewhat outdated. More modern books put MCMC methods at the centre, not just as an afterthought at the end of the book. So, I must find another Bayesian book to try to read for the next ten years…
Is it just me, or is the book very hard to grasp? Every time I pick it up to read about a concept, I get lost in mathematical notation with no clear explanation.