Gradient boosting is quite a general model: it can handle both classification and regression tasks. To use it for regression, all we need to do is change the objective and the evaluation metric.
For binary classification, we used the binary:logistic objective, but for regression, we just change it to reg:linear. When it comes to evaluation, there are the following built-in evaluation metrics (see the sketch after this list):
- Root Mean Squared Error (set eval_metric to rmse)
- Mean Absolute Error (set eval_metric to mae)
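For illustration, here is a minimal sketch of what the parameter dictionary might look like. The tree parameters (eta, max_depth, min_child_weight) are placeholder values rather than ones chosen in this section, and note that newer XGBoost releases rename reg:linear to reg:squarederror:

```python
import xgboost as xgb

# Same parameters as for classification; only objective and eval_metric change.
xgb_params = {
    'eta': 0.3,               # placeholder values, not tuned in this section
    'max_depth': 6,
    'min_child_weight': 1,

    'objective': 'reg:linear',   # newer XGBoost versions call this 'reg:squarederror'
    'eval_metric': 'rmse',       # or 'mae' for Mean Absolute Error

    'seed': 1,
    'verbosity': 1,
}
```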
Apart from these changes, the other parameters for tree-based models are exactly the same! We can follow the same approach for tuning the parameters, except that now we will monitor a different metric.
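Continuing the sketch, training and monitoring look the same as for classification. Assuming the xgb_params dictionary above and hypothetical train/validation splits (X_train, y_train, X_val, y_val), XGBoost reports RMSE on the watchlist after each boosting round, and that is the number we watch while tuning:

```python
import xgboost as xgb

# Wrap the (hypothetical) train and validation splits in DMatrix objects.
dtrain = xgb.DMatrix(X_train, label=y_train)
dval = xgb.DMatrix(X_val, label=y_val)

# The watchlist tells XGBoost which datasets to evaluate after each round.
watchlist = [(dtrain, 'train'), (dval, 'val')]

# Same training call as for classification; the output now shows
# train-rmse and val-rmse instead of the classification metric.
model = xgb.train(xgb_params, dtrain, num_boost_round=100,
                  evals=watchlist, verbose_eval=10)
```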