Chapter 9: XGBoost Kaggle Masters
In this chapter, you will learn valuable tips and tricks from Kaggle Masters who used XGBoost to win Kaggle competitions. Although we will not enter a Kaggle competition here, the skills you will gain apply to building stronger machine learning models in general. Specifically, you will learn why an extra hold-out set is critical, how to engineer new columns of data with mean encoding, how to use VotingClassifier and VotingRegressor to build non-correlated machine learning ensembles, and the advantages of stacking a final model. Each of these techniques is sketched briefly after the topic list below.
In this chapter, we will cover the following main topics:
Exploring Kaggle competitions
Engineering new columns of data
Building non-correlated ensembles
Stacking models
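Before exploring these topics in depth, it helps to see the shape of each technique in code. The sketches that follow are not the chapter's own examples; they are minimal illustrations using scikit-learn's built-in breast cancer dataset, with illustrative hyperparameters throughout. First, the extra hold-out set: a validation set that is reused for tuning slowly leaks into your modeling decisions, so a second, untouched split gives an honest final estimate of performance. A minimal sketch, assuming a standard two-stage train_test_split:

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)

# Split off a final hold-out set that is never touched while tuning models.
X_rest, X_holdout, y_rest, y_holdout = train_test_split(
    X, y, test_size=0.2, random_state=2)

# Split the remainder into training and validation sets for model selection.
# The hold-out set is scored exactly once, after all tuning is finished.
X_train, X_valid, y_train, y_valid = train_test_split(
    X_rest, y_rest, test_size=0.25, random_state=2)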
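Next, mean encoding (also called target encoding): each category in a column is replaced by the mean of the target for that category. The column names and toy values below are hypothetical; the key point is that the means are computed on training data only, so no validation or hold-out labels leak into the new feature:

import pandas as pd

# Hypothetical toy data: 'cat_col' is a categorical feature, 'target' is the label.
train = pd.DataFrame({'cat_col': ['a', 'b', 'a', 'c', 'b', 'a'],
                      'target':  [1, 0, 1, 0, 1, 0]})
test = pd.DataFrame({'cat_col': ['a', 'c', 'b']})

# Replace each category with the mean target value observed for it in training.
means = train.groupby('cat_col')['target'].mean()
train['cat_mean_enc'] = train['cat_col'].map(means)

# Categories unseen in training fall back to the global target mean.
test['cat_mean_enc'] = test['cat_col'].map(means).fillna(train['target'].mean())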
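For non-correlated ensembles, the intuition is that models which make different kinds of errors tend to cancel each other out when their predictions are averaged. A minimal sketch with scikit-learn's VotingClassifier, combining XGBoost with two deliberately different model families (the estimator choices here are illustrative):

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)

# Soft voting averages the models' predicted probabilities rather than their
# hard labels, so uncorrelated mistakes tend to wash out in the average.
ensemble = VotingClassifier(
    estimators=[('xgb', XGBClassifier(eval_metric='logloss')),
                ('lr', LogisticRegression(max_iter=10000)),
                ('knn', KNeighborsClassifier())],
    voting='soft')

print(cross_val_score(ensemble, X, y, cv=5).mean())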
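Finally, stacking: the base models generate out-of-fold predictions, and a final model (the meta-learner) is trained on those predictions to learn how best to combine them. A minimal sketch with scikit-learn's StackingClassifier, again with illustrative estimator choices:

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)

# Base models produce out-of-fold predictions via internal cross-validation;
# the final estimator is then fit on those predictions as its input features.
stack = StackingClassifier(
    estimators=[('xgb', XGBClassifier(eval_metric='logloss')),
                ('knn', KNeighborsClassifier())],
    final_estimator=LogisticRegression(max_iter=10000),
    cv=5)

print(cross_val_score(stack, X, y, cv=5).mean())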