Bagging

Bagging stands for Bootstrap AGGregatING and was introduced by Breiman (1994). Bagging is an example of a homogeneous ensemble: the base learning algorithm stays the same throughout, here a classification tree. Each tree fitted to a bootstrap sample serves as one base learner. This also means that when we bootstrapped the linear regression model in Chapter 2, Bootstrapping, we were in effect already building an ensemble. A few remarks regarding the combination of results from multiple trees are in order here.
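The bagging recipe sketched above, training each base learner on its own bootstrap sample and combining predictions by majority vote, can be summarized in a few lines of code. The book's examples use R; the following is an illustrative Python sketch of the same idea, with hypothetical names (`BaggedClassifier`, `MajorityClassStump`) chosen for this example only, and a trivial stump standing in for the classification tree.

```python
# Minimal sketch of bagging (bootstrap aggregation). Names and the
# toy base learner are illustrative assumptions, not the book's code.
import random
from collections import Counter

def bootstrap_sample(data):
    """Draw a sample of the same size as data, with replacement."""
    n = len(data)
    return [data[random.randrange(n)] for _ in range(n)]

def majority_vote(labels):
    """Combine base-learner predictions by majority vote."""
    return Counter(labels).most_common(1)[0][0]

class MajorityClassStump:
    """Toy base learner: always predicts the majority class it saw.
    A real bagging ensemble would use a classification tree here."""
    def fit(self, data):
        self.label = Counter(y for _, y in data).most_common(1)[0][0]
    def predict(self, x):
        return self.label

class BaggedClassifier:
    def __init__(self, base_learner_factory, n_estimators=25):
        self.factory = base_learner_factory
        self.n_estimators = n_estimators
        self.models = []

    def fit(self, data):
        # Each base learner is trained on its own bootstrap sample.
        self.models = []
        for _ in range(self.n_estimators):
            model = self.factory()
            model.fit(bootstrap_sample(data))
            self.models.append(model)

    def predict(self, x):
        # Aggregate: majority vote over all bootstrap-trained learners.
        return majority_vote(m.predict(x) for m in self.models)
```

Because every base learner is produced by the same algorithm, only the bootstrap samples differ, which is exactly what makes this a homogeneous ensemble.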

Ensemble methods combine the outputs from multiple models, also known as base learners, and produce a single result. A benefit of this approach is that if each of these base learners possesses a desired property, then the combined result will have increased ...
