The jackknife and the bootstrap are nonparametric methods for assessing the errors in a statistical estimation problem. They provide several advantages over the traditional parametric approach: the methods are easy to describe, they apply to arbitrarily complicated situations, and distribution assumptions, such as normality, are never made. This monograph connects the jackknife, the bootstrap, and many other related ideas, such as cross-validation, random subsampling, and balanced repeated replications, into a unified exposition. The theoretical development is at an easy mathematical level and is supplemented by a large number of numerical examples. The methods described in this monograph form a useful set of tools for the applied statistician. They are particularly useful in problem areas where complicated data structures are common, for example, in censoring, missing data, and highly multivariate situations.
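To give a flavor of the two resampling plans the monograph is built around, here is a minimal sketch (not taken from the book) of estimating the standard error of a statistic. The bootstrap draws repeated samples with replacement from the data; the jackknife recomputes the statistic leaving out one observation at a time. The sample data and the choice of 2000 bootstrap replications are illustrative assumptions.

```python
import random
import statistics

def bootstrap_se(data, stat, n_boot=2000, seed=0):
    """Bootstrap standard error: resample with replacement, recompute stat."""
    rng = random.Random(seed)
    n = len(data)
    reps = [stat([data[rng.randrange(n)] for _ in range(n)])
            for _ in range(n_boot)]
    return statistics.stdev(reps)

def jackknife_se(data, stat):
    """Jackknife standard error: leave one observation out at a time."""
    n = len(data)
    leave_one_out = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    mean_loo = statistics.fmean(leave_one_out)
    return ((n - 1) / n * sum((v - mean_loo) ** 2 for v in leave_one_out)) ** 0.5

# Illustrative data, not from the monograph.
sample = [2.1, 3.4, 1.9, 5.2, 4.8, 3.3, 2.7, 4.1, 3.9, 2.5]
print(bootstrap_se(sample, statistics.mean))
print(jackknife_se(sample, statistics.mean))
```

For the sample mean, the jackknife reproduces the textbook formula s/√n exactly, which is a handy sanity check; the bootstrap gives a nearby answer that depends on the random resamples.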
I have probably 200-300 reference books on my shelf in applied mathematics and statistics, and of all of those, this small monograph is perhaps the most accessible and directly applicable. Prof. Efron is a legend in the realm of non-parametric statistics, especially the re-sampling plans described here. Unlike the vast majority of books on mathematical statistics, this one uses essentially no calculus or higher math - just some results in series and sequences, and relatively conventional arithmetic. A great primer on these fascinating and powerful techniques!