December 2018
Beginner to intermediate
682 pages
18h 1m
English
Gradient boosting is one of the competition-winning algorithms. It works on the boosting principle: weak learners, typically shallow decision trees, are combined into an ensemble iteratively, with each new learner shifting focus toward the observations that previous iterations found difficult to predict. Like other boosting methods, it builds the model in a stage-wise fashion, but it generalizes them by allowing the optimization of an arbitrary differentiable loss function.
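The stage-wise idea above can be sketched in a few lines of NumPy. This is a minimal illustration with squared-error loss, not the book's implementation: each depth-1 "stump" (the `fit_stump` helper is a hypothetical name introduced here) is fit to the current residuals, which are the negative gradient of the loss, so each stage concentrates on the points the ensemble still predicts poorly.

```python
import numpy as np

def fit_stump(x, r):
    """Fit a depth-1 regression tree (stump) to residuals r."""
    best = None
    for t in np.unique(x):
        left = x <= t
        if left.all() or not left.any():
            continue
        pl, pr = r[left].mean(), r[~left].mean()
        sse = ((r[left] - pl) ** 2).sum() + ((r[~left] - pr) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, pl, pr)
    _, t, pl, pr = best
    return lambda z: np.where(z <= t, pl, pr)

# Toy regression data (illustrative only)
rng = np.random.RandomState(0)
x = rng.uniform(0, 10, 200)
y = np.sin(x) + rng.normal(scale=0.1, size=200)

# Stage-wise boosting: each stump is fit to the residuals, i.e. the
# negative gradient of the squared-error loss, shifting focus toward
# the observations the current ensemble predicts worst.
lr, F = 0.1, np.full_like(y, y.mean())  # start from a constant model
for _ in range(100):
    stump = fit_stump(x, y - F)
    F += lr * stump(x)  # shrunken update, as in gradient descent

print(np.mean((y - F) ** 2) < np.mean((y - y.mean()) ** 2))  # True
```

Swapping the residual computation for the gradient of another differentiable loss (for example, absolute error or log loss) yields the generalization the description mentions.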
Because the working principle of gradient boosting challenges many data scientists, let's start understanding it with a simple example: