October 2018
In this section, we will learn how to tune the hyperparameters of the AdaBoost classifier. The AdaBoost classifier has only one hyperparameter of interest: the number of base estimators (decision trees), set via `n_estimators`.
We can optimize the hyperparameters of the AdaBoost classifier using the following code:
from sklearn.model_selection import GridSearchCV

# Creating a grid of hyperparameters
grid_params = {'n_estimators': [100, 200, 300]}

# Building a 3-fold CV GridSearchCV object
grid_object = GridSearchCV(estimator=ada_boost, param_grid=grid_params,
                           scoring='accuracy', cv=3, n_jobs=-1)

# Fitting the grid to the training data
grid_object.fit(X_train, y_train)

# Extracting the best parameters
grid_object.best_params_
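The snippet above assumes that `ada_boost`, `X_train`, and `y_train` already exist. The following is a minimal, self-contained sketch of the same grid search, using a synthetic dataset purely for illustration (the data and random seeds are assumptions, not part of the original text):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Hypothetical data standing in for the book's training set
X, y = make_classification(n_samples=500, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# The classifier whose n_estimators hyperparameter we tune
ada_boost = AdaBoostClassifier(random_state=42)

# Grid of candidate values for the number of base estimators
grid_params = {'n_estimators': [100, 200, 300]}

# 3-fold cross-validated grid search over the candidate values
grid_object = GridSearchCV(estimator=ada_boost, param_grid=grid_params,
                           scoring='accuracy', cv=3, n_jobs=-1)
grid_object.fit(X_train, y_train)

# Best hyperparameter found, and the refitted best model's held-out accuracy
best_params = grid_object.best_params_
test_accuracy = grid_object.best_estimator_.score(X_test, y_test)
```

After fitting, `grid_object.best_estimator_` is the AdaBoost model refitted on all of the training data with the winning `n_estimators`, so it can be used directly for evaluation or prediction.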