Adaptive boosting (AdaBoost)
Adaptive boosting, or AdaBoost, is another supervised ML boosting algorithm. In adaptive boosting, the distribution of the training data set is altered so that the weights on the sample observations that are challenging to classify or predict are increased. In gradient boosting, by contrast, the sample distribution is not modified; instead, each new weak learner is trained on the residual errors of the current strong learner (the ensemble built so far). Remember that a weak learner is defined as a classifier that has only a weak correlation with the true classification: it can classify or predict just slightly better than random guessing. A strong learner, on the other hand, is a classifier that is well correlated with the true classification and can classify or predict with high accuracy.
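The contrast between the two approaches can be seen side by side in code. The following is a minimal sketch using scikit-learn's AdaBoostClassifier and GradientBoostingClassifier; the synthetic data set and hyperparameter values are illustrative, not a recommendation for any particular problem.

```python
# Illustrative comparison of AdaBoost (sample reweighting) and
# gradient boosting (residual fitting); data and parameters are made up.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data standing in for a real data set
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# AdaBoost: each round reweights the training samples so that the
# observations misclassified in the previous round count for more.
# The default weak learner is a depth-1 decision tree (a "stump").
ada = AdaBoostClassifier(n_estimators=100, learning_rate=0.5, random_state=42)
ada.fit(X_train, y_train)

# Gradient boosting: the sample distribution is left unchanged; each
# new tree instead fits the residual errors of the ensemble so far.
gbm = GradientBoostingClassifier(
    n_estimators=100, learning_rate=0.1, max_depth=3, random_state=42
)
gbm.fit(X_train, y_train)

print(f"AdaBoost test accuracy:          {ada.score(X_test, y_test):.3f}")
print(f"Gradient boosting test accuracy: {gbm.score(X_test, y_test):.3f}")
```

Note how neither model requires the weak learners to be accurate on their own; the boosting loop combines many slightly-better-than-random trees into a single strong learner.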