
Regression Analysis with Python by Alberto Boschetti, Luca Massaron


Gradient Boosting Regressor with LAD

More than a new technique, this is an ensemble of techniques already seen in this book, combined with a new loss function: Least Absolute Deviations (LAD). Unlike the least squares loss seen in the previous chapter, which minimizes the L2 norm of the error, LAD minimizes its L1 norm, that is, the sum of the absolute values of the residuals.
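The difference between the two norms can be checked on a toy set of residuals (the values here are made up purely for illustration):

```python
import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# L1 norm of the error (LAD): sum of absolute residuals
lad_loss = np.abs(y_true - y_pred).sum()        # 0.5 + 0.5 + 0.0 + 1.0 = 2.0

# L2 norm (least squares): sum of squared residuals
ls_loss = ((y_true - y_pred) ** 2).sum()        # 0.25 + 0.25 + 0.0 + 1.0 = 1.5
```

Note how the squared loss inflates the contribution of the largest residual (1.0 stays 1.0 here, but a residual of 3 would count as 9), which is why LAD is less sensitive to outliers.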

Regressors based on LAD are typically robust to outliers but unstable, because the loss function can have multiple minima (and therefore multiple equally good solutions). Alone, this loss function seems to bear little value, but paired with gradient boosting it produces a very stable regressor, because boosting overcomes the limitations of plain LAD regression. In code, this is very simple to achieve:

In: from sklearn.ensemble import GradientBoostingRegressor ...
