Performing cross-validation with the boosting method

Similar to the bagging function, adabag provides a cross-validation function for the boosting method, named boosting.cv. In this recipe, we will demonstrate how to perform cross-validation using the boosting.cv function from the adabag package.

Getting ready

In this recipe, we continue to use the telecom churn dataset as the input data source to perform a k-fold cross-validation with the boosting method.
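The code in the next section assumes that the churn data has already been loaded and split into trainset (and testset) in the earlier recipes, and that adabag is attached. As a reminder, the following is a minimal sketch of one way that setup could look; the use of the C50 package as the data source (newer versions may ship the data in modeldata instead), the 70/30 split, and the seed value are assumptions here, not steps from this recipe:

  > # Sketch only: load packages and the churn data, then split into
  > # training and test sets (assumed to have been done in earlier recipes)
  > library(adabag)
  > library(C50)
  > data(churn)                     # loads churnTrain and churnTest
  > set.seed(2)
  > ind = sample(2, nrow(churnTrain), replace = TRUE, prob = c(0.7, 0.3))
  > trainset = churnTrain[ind == 1, ]
  > testset  = churnTrain[ind == 2, ]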

How to do it...

Perform the following steps to obtain the estimated errors via cross-validation with the boosting method:

  1. First, you can use boosting.cv to cross-validate the training dataset:
    > churn.boostcv = boosting.cv(churn ~ ., v = 10, data = trainset, mfinal = 5, control = rpart.control(cp = 0.01))

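Once the call returns, the fitted object can be inspected to see how well boosting performed under cross-validation. According to the adabag documentation, boosting.cv returns a list that includes the predicted class, a confusion matrix, and the average error rate; the following is a minimal sketch of how you might examine those components:

    > # Inspect the cross-validated results: the confusion matrix of
    > # observed versus predicted classes, and the estimated error rate
    > churn.boostcv$confusion
    > churn.boostcv$error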