AdaBoost

Another technique is called AdaBoost (short for Adaptive Boosting), and it works slightly differently from many other classifiers. The base learner is usually a Decision Tree, but the training dataset is continuously re-weighted to force the model to focus on the samples that are misclassified. Moreover, the classifiers are added sequentially, so each new one boosts its predecessors by improving performance in the areas where they were not as accurate as expected. At each iteration, a weight factor is applied to each sample, increasing the importance of the samples that are wrongly predicted and decreasing the importance of the others. In other words, the model is repeatedly boosted, starting as a very weak learner and growing, estimator by estimator, until the ensemble behaves like a strong one (or a maximum number of estimators is reached).
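
To make the re-weighting step concrete, here is the classical discrete AdaBoost update (a standard formulation, not quoted from this excerpt), assuming binary labels $y_i \in \{-1, +1\}$ and weak learners $h_t$:

$$\alpha_t = \frac{1}{2}\ln\left(\frac{1 - \epsilon_t}{\epsilon_t}\right), \qquad w_{t+1}(i) = \frac{w_t(i)\,e^{-\alpha_t y_i h_t(x_i)}}{Z_t}$$

Here, $\epsilon_t$ is the weighted error rate of the $t$-th weak learner and $Z_t$ is a normalization constant. Since $y_i h_t(x_i) = -1$ for a misclassified sample, its weight is multiplied by $e^{\alpha_t} > 1$, while correctly classified samples are down-weighted by $e^{-\alpha_t} < 1$, which is exactly the boosting behavior described above.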
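
A minimal sketch of this procedure using scikit-learn's AdaBoostClassifier follows; the synthetic dataset and hyperparameter values are illustrative choices, not taken from the book:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification problem (illustrative values)
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=15, random_state=1000)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1000)

# By default, AdaBoostClassifier boosts decision stumps (depth-1
# Decision Trees). n_estimators sets how many weak learners are added
# sequentially; learning_rate scales each learner's contribution.
ada = AdaBoostClassifier(n_estimators=50, learning_rate=1.0,
                         random_state=1000)
ada.fit(X_train, y_train)

print('AdaBoost test accuracy: {:.3f}'.format(ada.score(X_test, y_test)))
```

Increasing n_estimators usually improves training accuracy, because each new stump concentrates on the samples the current ensemble still gets wrong, but it can eventually overfit noisy datasets, so the value is normally tuned by cross-validation.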
