August 2019
Intermediate to advanced
342 pages
9h 35m
English
We will now try to further improve our predictions by using XGBoost, an optimized implementation of the gradient boosting algorithm: it is designed for performance (for example, through parallel computing) and adds regularization that helps reduce overfitting.
We will use the XGBClassifier class from the xgboost library, which implements the eXtreme Gradient Boosting classifier, as shown in the following code:
from sklearn import metrics
from xgboost.sklearn import XGBClassifier

xgb_model = XGBClassifier()
xgb_model.fit(xtrain, ytrain, eval_metric=['error'],
              eval_set=[(xtrain, ytrain), (xtest, ytest)])
y_pred = xgb_model.predict(xtest)
print("Accuracy is :")
print(metrics.accuracy_score(ytest, y_pred))

Accuracy is :
0.999472542929
The accuracy ...