Adaptive Computation and Machine Learning

Boosting: Foundations and Algorithms

An accessible introduction and essential reference for an approach to machine learning that creates highly accurate prediction rules by combining many weak and inaccurate ones.

Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate “rules of thumb.” A remarkably rich theory has evolved around boosting, with connections to a range of topics, including statistics, game theory, convex optimization, and information geometry. Boosting algorithms have also enjoyed practical success in such fields as biology, vision, and speech processing. At various times in its history, boosting has been perceived as mysterious, controversial, even paradoxical.
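To make the "combining weak rules" idea concrete, here is a minimal sketch of AdaBoost, the best-known boosting algorithm, using one-level decision trees (stumps) as the weak rules of thumb. This is illustrative only and not code from the book; the use of numpy and scikit-learn, the function names, and the default of 50 rounds are all assumptions.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, T=50):
    # Labels y must be +1/-1. Returns a list of (vote_weight, weak_rule) pairs.
    y = np.asarray(y)
    n = len(y)
    D = np.full(n, 1.0 / n)                    # distribution over training examples
    ensemble = []
    for _ in range(T):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=D)       # weak learner trained on weighted data
        pred = stump.predict(X)
        eps = np.clip(D[pred != y].sum(), 1e-10, 1 - 1e-10)  # weighted error of this rule
        alpha = 0.5 * np.log((1 - eps) / eps)  # vote weight: larger for more accurate rules
        D *= np.exp(-alpha * y * pred)         # up-weight examples this rule got wrong
        D /= D.sum()                           # renormalize to a distribution
        ensemble.append((alpha, stump))
    return ensemble

def adaboost_predict(ensemble, X):
    # Final prediction is the sign of the weighted vote of all weak rules.
    votes = sum(alpha * stump.predict(X) for alpha, stump in ensemble)
    return np.sign(votes)

Each round re-weights the training examples so the next weak rule concentrates on the cases the previous ones handled badly; the final predictor is a weighted majority vote, which is the "combining many weak and inaccurate rules of thumb" described above.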

This book, written by the inventors of the method, brings together, organizes, simplifies, and substantially extends two decades of research on boosting, presenting both theory and applications in a way that is accessible to readers from diverse backgrounds while also providing an authoritative reference for advanced researchers. With its introductory treatment of all material and its inclusion of exercises in every chapter, the book is appropriate for course use as well.

The book begins with a general introduction to machine learning algorithms and their analysis; then explores the core theory of boosting, especially its ability to generalize; examines some of the myriad other theoretical viewpoints that help to explain and understand boosting; provides practical extensions of boosting for more complex learning problems; and finally presents a number of advanced theoretical topics. Numerous applications and practical illustrations are offered throughout.

543 pages, Paperback

First published January 1, 2012


About the author

Robert E. Schapire

Robert Elias Schapire is an American computer scientist and the former David M. Siegel '83 Professor of Computer Science at Princeton University; he is now a Principal Researcher at Microsoft Research in New York City. His primary specialty is theoretical and applied machine learning.


Community Reviews

5 stars: 12 (38%)
4 stars: 11 (35%)
3 stars: 5 (16%)
2 stars: 2 (6%)
1 star: 1 (3%)
Chris (142 reviews, 41 followers), December 31, 2018
Well written, but a pointless topic.

If you want this much detail, here you have it, in readable prose. However, boosting won't perform the magic people promise it will. And the connection to differential geometry (through the Fisher information semi-metric) or optimal pursuit is, in my opinion, pointless.
Jan (28 reviews, 5 followers), December 8, 2018
Good introduction, but a tad dated. We know a good deal more now.
