So far we have discussed three ways to perform Cross-Validation with XGBoost: a hold-out dataset, manual K-Fold, and XGBoost's built-in K-Fold. Any of them can be used to estimate performance and select the best-performing model.
The implementations built into XGBoost are typically better suited for this task because they report the evaluation metric at every boosting round, and the training process can be stopped manually once you see that the training and validation learning curves diverge too much.
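As a quick illustration, here is a minimal sketch of XGBoost's built-in K-Fold Cross-Validation via xgb.cv. The synthetic dataset and all parameter values are placeholder assumptions, not the book's own example; the early_stopping_rounds option simply automates the manual stop described above.

```python
import xgboost as xgb
from sklearn.datasets import make_classification

# Synthetic data as a stand-in for your real dataset.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
dtrain = xgb.DMatrix(X, label=y)

params = {"objective": "binary:logistic", "eval_metric": "logloss"}

# xgb.cv reports the mean train/test metric after every boosting round;
# early_stopping_rounds halts once the test metric stops improving,
# i.e., once the learning curves start to diverge.
cv_results = xgb.cv(
    params,
    dtrain,
    num_boost_round=500,
    nfold=5,
    early_stopping_rounds=20,
    verbose_eval=10,  # print progress every 10 rounds
    seed=42,
)
print(f"Best number of boosting rounds: {len(cv_results)}")
```

Because xgb.cv returns one row of metrics per completed round (truncated at the best iteration when early stopping fires), the length of the returned frame tells you how many rounds to use when training the final model.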
If your dataset is relatively large (for example, more than 100k examples), then simply setting aside a hold-out dataset may be the best and fastest option. On the other hand, if your dataset is smaller, it may be a good idea to perform K-Fold Cross-Validation.
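To make the contrast concrete, the following sketch shows the hold-out variant with early stopping; again, the synthetic data, split ratio, and parameter values are illustrative assumptions only.

```python
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a larger dataset.
X, y = make_classification(n_samples=200_000, n_features=20, random_state=42)
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=42
)

dtrain = xgb.DMatrix(X_train, label=y_train)
dvalid = xgb.DMatrix(X_valid, label=y_valid)

params = {"objective": "binary:logistic", "eval_metric": "logloss"}

# Training stops once the hold-out metric has not improved
# for 20 consecutive rounds.
booster = xgb.train(
    params,
    dtrain,
    num_boost_round=500,
    evals=[(dtrain, "train"), (dvalid, "valid")],
    early_stopping_rounds=20,
    verbose_eval=10,
)
print(f"Best iteration: {booster.best_iteration}")
```

A single split like this trains K times fewer models than K-Fold Cross-Validation, which is why it is the faster option on large datasets.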
Once we have decided on the validation ...