Gradient boosting

Gradient boosting is a type of ensemble learner: a final model is built from a collection of simpler models. Individually, these models have weak predictive capacity, since each is a simple, high-bias learner that underfits the data, but combined in an ensemble they produce far better overall results. In gradient boosting machines, decision trees are the most common type of weak model used. So, in a nutshell, gradient boosting is a machine learning technique for regression and classification that produces a prediction model as an ensemble of weak prediction models, typically decision trees. Let's now see how we can define it mathematically.
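Before the formal definition, the ensemble idea can be sketched in code. The following is a minimal illustrative sketch (not the book's implementation) of gradient boosting for regression with squared loss: each shallow decision tree is fit to the residuals of the current ensemble, and its scaled prediction is added to the running model. The data, `n_rounds`, and `learning_rate` values are arbitrary choices for demonstration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Synthetic regression data (illustrative only)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, size=200)

n_rounds, learning_rate = 50, 0.1
prediction = np.full_like(y, y.mean())  # start from a constant model
trees = []
for _ in range(n_rounds):
    residuals = y - prediction               # negative gradient of squared loss
    tree = DecisionTreeRegressor(max_depth=2)  # weak, high-bias learner
    tree.fit(X, residuals)
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

# Training error of the ensemble vs. the initial constant model
print(np.mean((y - prediction) ** 2), np.mean((y - y.mean()) ** 2))
```

Each tree on its own is a poor predictor, but because every round corrects the errors left by the previous rounds, the ensemble's training error shrinks well below that of the initial constant model.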

Any supervised learning algorithm aims to define ...

Get Hands-On Artificial Intelligence on Google Cloud Platform now with O’Reilly online learning.
