Gradient boosting
Gradient boosting is another ensemble supervised ML algorithm that can be used for both classification and regression problems. Algorithms such as random forest, extra trees, and gradient boosting are called ensemble methods because the final model is generated by combining many individual models. As described in the bagging/boosting comparison section, gradient boosting trains many models sequentially, with each new model focusing on the instances the previous models predicted poorly; challenging cases therefore receive the most attention during training. This sequential training gradually minimizes a loss function, in a manner similar to how an artificial neural network model is trained (will ...
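As a minimal sketch of the idea, the following example fits scikit-learn's `GradientBoostingRegressor` on synthetic data (the dataset, parameter values, and variable names here are illustrative assumptions, not taken from the book's own examples):

```python
# Minimal gradient boosting regression sketch using scikit-learn.
# The synthetic dataset is for illustration only.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=5, noise=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# n_estimators sets how many trees are trained sequentially;
# learning_rate scales each tree's contribution to the ensemble;
# each successive tree is fit to reduce the remaining loss.
model = GradientBoostingRegressor(
    n_estimators=200, learning_rate=0.1, max_depth=3, random_state=42
)
model.fit(X_train, y_train)
print(model.score(X_test, y_test))  # R^2 on held-out data
```

A smaller `learning_rate` usually requires more estimators but tends to generalize better, which is the typical trade-off tuned in practice.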