Let's continue using the Wine dataset to analyze the performance of AdaBoost with different parameters. As with almost all algorithms, Scikit-Learn implements both a classifier, AdaBoostClassifier (based on the SAMME and SAMME.R algorithms), and a regressor, AdaBoostRegressor (based on the AdaBoost.R2 algorithm). In this case, we are going to use the classifier, but I invite the reader to test the regressor with a custom dataset or one of the built-in toy datasets. In both classes, the most important parameters are n_estimators and learning_rate (whose default value is 1.0). The default underlying weak learner is always a decision tree, but it's possible to employ other models by creating a base instance and passing ...
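The setup described above can be sketched as follows. This is a minimal example, not the book's exact listing: it loads the Wine dataset, trains an AdaBoostClassifier with the two key parameters (n_estimators and learning_rate), and evaluates it with 10-fold cross-validation; the specific values chosen here (50 estimators, random_state=1000) are illustrative assumptions.

```python
from sklearn.datasets import load_wine
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

# Load the Wine dataset (13 features, 3 classes)
X, y = load_wine(return_X_y=True)

# AdaBoost with its two most important parameters; the default weak
# learner is a shallow decision tree (a decision stump)
clf = AdaBoostClassifier(n_estimators=50, learning_rate=1.0,
                         random_state=1000)

# Estimate the accuracy with 10-fold cross-validation
scores = cross_val_score(clf, X, y, cv=10)
print('Mean CV accuracy: {:.3f}'.format(scores.mean()))
```

To swap in a different weak learner, an instance of the desired model can be passed to the constructor (the keyword is estimator in recent Scikit-Learn versions, base_estimator in older ones).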
Example of AdaBoost with Scikit-Learn