Sequences of models – AdaBoost

AdaBoost is a boosting algorithm that can be interpreted as a form of gradient descent in function space. It fits a sequence of weak learners (originally stumps, that is, single-level decision trees) on re-weighted versions of the data. At each round, the weights are updated according to how difficult each case is to classify: misclassified cases receive more weight. The idea is that the first trees learn the easy examples and the later ones concentrate on the difficult ones. In the end, the predictions of the weak learners are combined by a weighted vote chosen to maximize overall performance:

In: import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import cross_val_score
    hypothesis = AdaBoostClassifier(n_estimators=300, random_state=101)
    scores = cross_val_score(hypothesis, covertype_X, covertype_y, ...
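To make the re-weighting mechanism described above concrete, here is a minimal from-scratch sketch of the AdaBoost.M1 loop using decision stumps; the synthetic dataset, the number of rounds, and the small epsilon guarding the error term are illustrative assumptions, not part of the book's example:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Illustrative synthetic dataset; AdaBoost.M1 uses labels in {-1, +1}
X, y = make_classification(n_samples=500, random_state=101)
y = np.where(y == 0, -1, 1)

n_rounds = 10
w = np.full(len(y), 1.0 / len(y))    # start with uniform case weights
stumps, alphas = [], []

for _ in range(n_rounds):
    stump = DecisionTreeClassifier(max_depth=1)   # weak learner: one split
    stump.fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = w[pred != y].sum()                      # weighted error rate
    alpha = 0.5 * np.log((1 - err) / (err + 1e-10))  # this stump's vote weight
    w *= np.exp(-alpha * y * pred)                # up-weight misclassified cases
    w /= w.sum()                                  # renormalize to a distribution
    stumps.append(stump)
    alphas.append(alpha)

# The ensemble prediction is the sign of the weighted sum of stump votes
ensemble = np.sign(sum(a * s.predict(X) for a, s in zip(alphas, stumps)))
accuracy = (ensemble == y).mean()
```

Each round raises the weight of the cases the current stump got wrong, so the next stump is pulled toward them; scikit-learn's AdaBoostClassifier wraps this same loop behind a single fit call.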
