Chapter 3: Bagging with Random Forests

In this chapter, you will gain proficiency in building random forests, a leading competitor to XGBoost. Like XGBoost, random forests are ensembles of decision trees; the difference is that random forests combine trees via bagging, whereas XGBoost combines trees via boosting. Random forests are a viable alternative to XGBoost, with advantages and limitations that this chapter highlights. Learning about random forests matters because they shed light on the structure of tree-based ensembles such as XGBoost, and because contrasting bagging with boosting deepens your understanding of both methods.
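
To make the contrast concrete, here is a minimal sketch, assuming scikit-learn and the xgboost package are installed; the synthetic dataset and hyperparameters are illustrative placeholders, not examples from the book:

    # Contrasting a bagging ensemble with a boosting ensemble.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import RandomForestClassifier
    from xgboost import XGBClassifier

    # Synthetic binary classification data, for illustration only
    X, y = make_classification(n_samples=1000, n_features=20, random_state=2)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

    # Bagging: each tree is trained independently on a bootstrap sample
    # of the data, and predictions are combined by majority vote
    rf = RandomForestClassifier(n_estimators=100, random_state=2)
    rf.fit(X_train, y_train)

    # Boosting: trees are trained sequentially, each one correcting
    # the errors of the trees that came before it
    xgb = XGBClassifier(n_estimators=100, random_state=2)
    xgb.fit(X_train, y_train)

    print('Random forest accuracy:', rf.score(X_test, y_test))
    print('XGBoost accuracy:', xgb.score(X_test, y_test))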

You will build and evaluate random forest classifiers ...
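
As a preview of that workflow, the following sketch builds a random forest classifier and evaluates it with cross-validation; the dataset and parameters are placeholders, not the chapter's own examples:

    # Building and evaluating a random forest classifier.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Illustrative dataset bundled with scikit-learn
    X, y = load_breast_cancer(return_X_y=True)

    # Score the ensemble with 5-fold cross-validation
    rf = RandomForestClassifier(n_estimators=100, random_state=2)
    scores = cross_val_score(rf, X, y, cv=5)
    print('Accuracy per fold:', scores.round(3))
    print('Mean accuracy:', scores.mean().round(3))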
