The general boosting algorithm
The tree-based ensembles of the previous chapters, bagging and random forests, are important extensions of decision trees. Bagging improves stability by averaging many decision trees, but the bias of the individual trees persists. This limitation motivated Breiman to sample the covariates at each split point, generating an ensemble of nearly independent trees and laying the foundation for random forests. As with bagging, the trees of a random forest can be grown in parallel. In both cases, averaging over many trees serves to balance the bias-variance trade-off. Boosting is the third major extension of decision trees, and probably the most effective one. It is again ...