June 2020
Intermediate to advanced
382 pages
11h 39m
English
XGBoost, first released in 2014, builds on gradient-boosting principles and has become one of the most popular ensemble learning algorithms. It grows a sequence of trees, each fitted to the residual errors of the ensemble so far, using gradient descent to minimize the loss. Its scalable, parallelizable design also makes it a good fit for distributed infrastructures, such as Apache Spark, and for cloud platforms, such as Google Cloud or Amazon Web Services (AWS).
Let's now see how we can implement gradient boosting with the XGBoost algorithm:
First, we will instantiate the XGBClassifier and train the model on the training portion of the data:
Then, we will generate predictions based on the newly trained model:
y_pred = classifier.predict(X_test) ...