Using gradient boosting with Shogun

The Shogun library implements the gradient boosting algorithm, but only for regression tasks. The implementation lives in the CStochasticGBMachine class. The main parameters for configuring objects of this class are the base learner model and the loss function; other parameters include the number of iterations, the learning rate, and the fraction of training vectors to be randomly sampled at each iteration.
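As a rough sketch of how this configuration might look in code (assuming the Shogun 6.x C++ API with its some<> smart-pointer helper; the tree depth, iteration count, learning rate, and subset fraction below are illustrative values, not prescribed ones):

```cpp
#include <shogun/base/some.h>
#include <shogun/loss/SquaredLoss.h>
#include <shogun/machine/StochasticGBMachine.h>
#include <shogun/multiclass/tree/CARTree.h>

using namespace shogun;

// Base learner: a shallow CART tree configured for regression.
// The boolean vector marks each attribute as nominal (true) or
// continuous (false); here there is a single continuous feature.
SGVector<bool> feature_types(1);
feature_types.set_const(false);
auto tree = some<CCARTree>(feature_types, PT_REGRESSION);
tree->set_max_depth(3);

// Squared loss is the usual choice for regression.
auto loss = some<CSquaredLoss>();

// The machine takes the base model, the loss function, the number of
// boosting iterations, the learning rate, and the fraction of training
// vectors sampled at each iteration.
auto gbm = some<CStochasticGBMachine>(tree, loss,
                                      /*num_iterations=*/100,
                                      /*learning_rate=*/0.1,
                                      /*subset_fraction=*/1.0);
```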

We will create an example that uses gradient boosting to approximate the cosine function, assuming that training and testing datasets are already available (the exact implementation of a data generator can be found in the source code).
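A minimal end-to-end sketch might look as follows; the inline grid-sampling loop is a hypothetical stand-in for the data generator, and gbm is the machine configured above:

```cpp
#include <shogun/features/DenseFeatures.h>
#include <shogun/labels/RegressionLabels.h>

#include <cmath>

using namespace shogun;

// Hypothetical stand-in for the data generator: sample cos(x) on a
// uniform grid over [0, 2*pi). Shogun stores samples column-wise, so
// the feature matrix is 1 x n for n one-dimensional samples.
const index_t n = 1000;
SGMatrix<float64_t> x(1, n);
SGVector<float64_t> y(n);
for (index_t i = 0; i < n; ++i) {
  x(0, i) = 2.0 * M_PI * static_cast<float64_t>(i) / n;
  y[i] = std::cos(x(0, i));
}
auto train_features = some<CDenseFeatures<float64_t>>(x);
auto train_labels = some<CRegressionLabels>(y);

// Train the boosting machine configured earlier and get predictions;
// in practice the model would be evaluated on a separate test set.
gbm->set_labels(train_labels);
gbm->train(train_features);
auto predictions = wrap(gbm->apply_regression(train_features));
```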
