Implementing a boosted regressor follows the same syntax as the boosted classifier:
In [15]: from sklearn.ensemble import GradientBoostingRegressor
...      boost_reg = GradientBoostingRegressor(n_estimators=10,
...                                            random_state=3)
We have seen earlier that a single decision tree can achieve a score of 79.3% on the Boston dataset, and that a bagged regressor made of 10 individual regression trees achieved 82.7%. But how does a boosted regressor compare?
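To see how such a comparison looks end to end, here is a minimal sketch that fits a single regression tree and a 10-estimator boosted regressor on the same split and compares their test scores. Note that `load_boston` was removed in scikit-learn 1.2, so a synthetic regression problem of the same shape (506 samples, 13 features) stands in for the Boston data here; the scores will therefore differ from the numbers quoted above.

```python
# Sketch: single regression tree vs. boosted regressor on the same split.
# A synthetic dataset stands in for the Boston data (an assumption),
# since load_boston was removed in scikit-learn 1.2.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=506, n_features=13, noise=10.0,
                       random_state=3)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=3)

tree = DecisionTreeRegressor(random_state=3).fit(X_train, y_train)
boost = GradientBoostingRegressor(n_estimators=10,
                                  random_state=3).fit(X_train, y_train)

# For regressors, score() returns the coefficient of determination (R^2).
print('single tree:', tree.score(X_test, y_test))
print('boosted:    ', boost.score(X_test, y_test))
```

Because both estimators share the same `fit`/`predict`/`score` interface, swapping one for the other requires no other code changes.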
Let's reload the Boston dataset and split it into training and test sets. We want to make sure we use the same value for random_state so that we end up training and testing on the same subsets of the data:
In [16]: dataset = load_boston()
...      X = dataset.data
...
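The reason for pinning `random_state` can be checked directly: two calls to `train_test_split` with the same seed return identical subsets, while a different seed does not. The sketch below uses a synthetic dataset as a stand-in (an assumption, since `load_boston` was removed in scikit-learn 1.2):

```python
# Sketch: a fixed random_state makes train_test_split reproducible.
# Synthetic data stands in for the Boston dataset (an assumption).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=506, n_features=13, random_state=3)

# Same seed -> the very same training and test subsets every time.
X_tr1, X_te1, y_tr1, y_te1 = train_test_split(X, y, random_state=3)
X_tr2, X_te2, y_tr2, y_te2 = train_test_split(X, y, random_state=3)
print(np.array_equal(X_tr1, X_tr2))  # True

# A different seed shuffles the rows differently before splitting.
X_tr3, X_te3, y_tr3, y_te3 = train_test_split(X, y, random_state=4)
print(np.array_equal(X_tr1, X_tr3))  # False
```

This is what makes the comparison against the earlier tree and bagging scores fair: every model sees exactly the same training rows and is evaluated on exactly the same test rows.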