The xgboost package

The xgboost R package is an optimized, distributed implementation of the gradient boosting method. The implementation is engineered to be efficient, flexible, and portable; see https://github.com/dmlc/xgboost for more details and regular updates. Because it supports parallel tree boosting, it has proved immensely useful to the data science community, especially as a great fraction of competition winners at www.kaggle.com use the xgboost technique. A partial list of winning Kaggle solutions is available at https://github.com/dmlc/xgboost/tree/master/demo#machine-learning-challenge-winning-solutions.
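As a quick, minimal sketch of the package in action (assuming xgboost is installed; the agaricus mushroom dataset used here is bundled with the package), the following R code fits a small boosted tree classifier and checks its test error:

```r
# Minimal xgboost sketch: binary classification on the bundled agaricus data
library(xgboost)

# Sparse feature matrices and 0/1 labels shipped with the package
data(agaricus.train, package = "xgboost")
data(agaricus.test,  package = "xgboost")

# Fit two shallow boosted trees with parallel tree construction (nthread)
bst <- xgboost(data = agaricus.train$data, label = agaricus.train$label,
               max_depth = 2, eta = 1, nthread = 2, nrounds = 2,
               objective = "binary:logistic", verbose = 0)

# Predicted probabilities on the test set and the resulting error rate
pred <- predict(bst, agaricus.test$data)
err  <- mean(as.numeric(pred > 0.5) != agaricus.test$label)
print(err)
```

The key tuning parameters here are nrounds (number of boosting iterations), max_depth and eta (tree depth and learning rate), while nthread controls how many cores are used when growing each tree.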

The main advantages of the extreme gradient boosting implementation are shown ...
