4 Sequential ensembles: Adaptive boosting

This chapter covers

  • Training sequential ensembles of weak learners
  • Implementing and understanding how AdaBoost works
  • Using AdaBoost in practice
  • Implementing and understanding how LogitBoost works

The ensembling strategies we’ve seen thus far have been parallel ensembles. These include homogeneous ensembles such as bagging and random forests (where the same base-learning algorithm is used to train base estimators) and heterogeneous ensemble methods such as stacking (where different base-learning algorithms are used to train base estimators).

Now, we’ll explore a new family of ensemble methods: sequential ensembles. Unlike parallel ensembles, which exploit the independence of each base estimator, sequential ensembles exploit their dependence: each new base estimator is trained to compensate for the mistakes of the estimators trained before it.
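As a first taste of a sequential ensemble in practice, here is a minimal sketch using scikit-learn's `AdaBoostClassifier` on synthetic data; the dataset and its parameters are illustrative choices, not examples from this chapter.

```python
# A minimal sketch of a sequential ensemble: AdaBoost, which trains its
# base estimators (decision stumps by default) one after another,
# upweighting training examples that earlier estimators misclassified.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data (illustrative, not from the book)
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

ensemble = AdaBoostClassifier(n_estimators=50, random_state=42)
ensemble.fit(X_train, y_train)
print(ensemble.score(X_test, y_test))
```

Because each estimator depends on its predecessors' mistakes, the 50 stumps here cannot be trained in parallel the way bagged trees can.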
