July 2017
Beginner to intermediate
715 pages
17h 3m
English
Gradient Boosting Machines (GBM) is an ensemble algorithm. The main idea behind GBM is to take some base model and fit it to the data over and over, gradually improving performance. It differs from Random Forest: GBM tries to improve on the previous results at each step, while Random Forest builds multiple independent models and averages their predictions.
This idea is best illustrated with a Linear Regression example. To fit several linear regressions to the data one after another, we can do the following:
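The iterative idea described above can be sketched in code. The snippet below is a minimal illustration, not the book's implementation: each stage fits a simple linear regression (via ordinary least squares) to the residuals of the ensemble built so far, and a learning rate shrinks each stage's contribution. All function names (`fit_linear`, `gbm_fit`, etc.) and parameter choices here are assumptions made for the example.

```python
import numpy as np

def fit_linear(X, y):
    """Ordinary least squares with an intercept term."""
    Xb = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return coef

def predict_linear(coef, X):
    Xb = np.column_stack([np.ones(len(X)), X])
    return Xb @ coef

def gbm_fit(X, y, n_stages=50, learning_rate=0.3):
    """Boosting loop: each stage models the current residuals."""
    models = []
    residual = y.copy()
    for _ in range(n_stages):
        coef = fit_linear(X, residual)
        models.append(coef)
        # Subtract a shrunken version of this stage's predictions,
        # so the next stage focuses on what is still unexplained.
        residual = residual - learning_rate * predict_linear(coef, X)
    return models

def gbm_predict(models, X, learning_rate=0.3):
    """The ensemble prediction is the shrunken sum over all stages."""
    pred = np.zeros(len(X))
    for coef in models:
        pred += learning_rate * predict_linear(coef, X)
    return pred

# Toy data: a noisy linear relationship (made up for this example)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = 2.0 * X[:, 0] + 1.0 + rng.normal(scale=0.1, size=200)

models = gbm_fit(X, y)
pred = gbm_predict(models, X)
print("train MSE:", float(np.mean((y - pred) ** 2)))
```

With a linear base model the boosted ensemble collapses back into a single linear model, so this example only shows the mechanics; in practice GBM uses shallow decision trees as base models, which is where the real gain over a single model comes from.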