Now we will use GradientBoostingClassifier, which is based on AdaBoost.
The ensemble classifier adopts the boosting technique, using gradient descent to minimize the cost function; each base classifier (itself a decision tree) is fitted to the residual errors left by the classifiers trained before it.
In the following code, we can see the gradient-boosting ensemble classifier in action:
from sklearn import ensemble, metrics

params = {'n_estimators': 500, 'max_depth': 3, 'subsample': 0.5,
          'learning_rate': 0.01, 'min_samples_leaf': 1, 'random_state': 3}

clf = ensemble.GradientBoostingClassifier(**params)
clf.fit(xtrain, ytrain)
y_pred = clf.predict(xtest)

print("Accuracy is :")
print(metrics.accuracy_score(ytest, y_pred))
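To observe the boosting behavior directly, we can track how accuracy evolves as trees are added with the staged_predict method. The following is a self-contained sketch: the synthetic dataset created with make_classification stands in for the book's xtrain/ytrain, and the hyperparameter values here are illustrative, not the ones used above.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data standing in for the book's dataset (assumption)
X, y = make_classification(n_samples=1000, n_features=20, random_state=3)
xtr, xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=3)

clf = GradientBoostingClassifier(n_estimators=100, max_depth=3,
                                 learning_rate=0.1, random_state=3)
clf.fit(xtr, ytr)

# staged_predict yields the ensemble's predictions after each boosting
# stage, so we can see accuracy improve as more trees correct the
# residual errors of the trees before them
staged_acc = [accuracy_score(yte, pred) for pred in clf.staged_predict(xte)]
print(f"After 1 tree:    {staged_acc[0]:.3f}")
print(f"After 100 trees: {staged_acc[-1]:.3f}")
```

Plotting staged_acc is also a common way to pick n_estimators: accuracy typically rises quickly and then plateaus, and training more trees past that point mainly risks overfitting.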