Chapter 4. Random Forests

The previous chapter introduced bagging as an ensembling technique built on homogeneous base learners, with the decision tree serving as the base learner. A shortcoming of bagging is that the bootstrap trees are correlated with one another; as a consequence, averaging them reduces the variance of the predictions only up to a point, and the bias is left untouched. Breiman proposed randomly sampling a subset of the covariates (the independent variables) as candidate split variables at each split, and this additional layer of randomization decorrelates the bootstrap trees.
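To make the idea concrete, the following minimal sketch grows such a forest in R. It assumes the randomForest package and the built-in iris data, neither of which is prescribed by this excerpt; the mtry argument controls how many covariates are sampled as split candidates at each node.

    # Each tree is grown on a bootstrap sample, and at every split only
    # 'mtry' randomly chosen covariates are considered as candidates.
    library(randomForest)

    set.seed(123)
    rf_fit <- randomForest(
      Species ~ .,        # classify species from the four measurements
      data = iris,
      ntree = 500,        # number of bootstrap trees
      mtry = 2,           # covariates sampled at each split (default is about sqrt(p))
      importance = TRUE   # track variable importance, discussed later in the chapter
    )

    print(rf_fit)         # out-of-bag error rate and confusion matrix
    importance(rf_fit)    # per-variable importance measures

Setting mtry equal to the full number of covariates would recover ordinary bagging; choosing a smaller value is what forces the trees to differ and hence decorrelates them.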

In the first section of this chapter, the random forest algorithm is introduced and illustrated. The notion of variable importance is crucial to decision trees and all of their variants, and a section is devoted to clearly illustrating ...
