There has been dramatic growth in the development and application of Bayesian inferential methods. Some of this growth is due to the availability of powerful simulation-based algorithms for summarizing posterior distributions. There has also been growing interest in the use of R for statistical analyses. R's open-source nature, free availability, and large number of contributed packages have made it the software of choice for many statisticians in education and industry.

Bayesian Computation with R introduces Bayesian modeling through computation with the R language. The early chapters present the basic tenets of Bayesian thinking using familiar one- and two-parameter inferential problems. Bayesian computational methods such as Laplace's method, rejection sampling, and the SIR algorithm are illustrated in the context of a random effects model. The construction and implementation of Markov chain Monte Carlo (MCMC) methods is introduced, and these simulation-based algorithms are implemented for a variety of Bayesian applications such as normal and binary response regression, hierarchical modeling, order-restricted inference, and robust modeling. Algorithms written in R are used to develop Bayesian tests and to assess Bayesian models through the posterior predictive distribution. The use of R to interface with WinBUGS, a popular MCMC computing language, is described with several illustrative examples. This book is a suitable companion for an introductory course on Bayesian methods and is valuable to the statistical practitioner who wishes to learn more about the R language and Bayesian methodology. The LearnBayes package, written by the author and available from the CRAN website, contains all of the R functions described in the book.

The second edition contains several new topics, such as the use of mixtures of conjugate priors and the use of Zellner's g priors to choose between models in linear regression. There are more illustrations of the construction of informative prior distributions, such as the use of conditional means priors and multivariate normal priors in binary regressions. The new edition also updates the R code illustrations to match the latest version of the LearnBayes package.
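To give a flavor of the simulation-based approach the book describes, here is a minimal sketch (not taken from the book, and using only base R rather than the LearnBayes package) of rejection sampling from a beta posterior, one of the algorithms mentioned above. The beta(7, 5) target corresponds to a hypothetical experiment with 6 successes in 10 trials under a uniform prior.

# Rejection sampling sketch: draw from a beta(7, 5) posterior using a
# uniform(0, 1) proposal. M bounds the target density over the proposal.
set.seed(1)
target <- function(p) dbeta(p, 7, 5)
M <- optimize(target, c(0, 1), maximum = TRUE)$objective

n <- 10000
candidates <- runif(n)                          # proposal draws
accept <- runif(n) < target(candidates) / M     # accept with prob target/(M * proposal)
posterior_draws <- candidates[accept]

mean(posterior_draws)                           # compare with the exact mean 7/12

The same idea extends to targets known only up to a normalizing constant, which is why rejection sampling and its refinements (such as the SIR algorithm) are useful starting points for the Bayesian computations the book covers.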
Jim Albert is a Distinguished University Professor of Statistics at Bowling Green State University. His research interests include Bayesian modeling and applications of statistical thinking in sports. He has authored or coauthored several books, including Ordinal Data Modeling, Bayesian Computation with R, and Workshop Statistics: Discovery with Data, A Bayesian Approach.
The first five chapters are excellent: they are well documented and come with working code on the author's website. From there the quality is widely considered to drop off. I personally found the exercises a good way to learn the basics of the techniques and trade-offs involved in sampling from multivariate probability distributions.
A well-written guide to Bayesian computation in R. One thing that really bugged me was that the list of commands is given at the end of each chapter, which makes things confusing at first because we are not told what the commands are meant to do. Many of the ideas could also be elaborated further.