Chapter 11: Cross-Validation

The concept of keeping training data and testing data separate is sacrosanct in machine learning and statistics. You should never train a model and then test its performance on the same data. Setting data aside for testing has a downside, though: that held-out data contains valuable information that you would otherwise want to use in training. Cross-validation is a technique used to circumvent this problem.
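As a minimal sketch of the holdout idea, the snippet below reserves the last 20% of a series for testing. The data, split fraction, and variable names are illustrative, not from the book; for time series the test set should always come from the end of the series, so the ordering is preserved.

```python
# Ten observations in time order, standing in for a time series.
data = list(range(10))

# Reserve the final 20% for testing (an illustrative fraction).
split = int(len(data) * 0.8)
train, test = data[:split], data[split:]

print("train:", train)  # the earlier 80% of observations
print("test:", test)    # the most recent 20%, never seen in training
```

The key point is that `train` and `test` never overlap: the model is fit only on `train` and evaluated only on `test`.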

You may be familiar with k-fold cross-validation, but if you are not, we will briefly cover it in this chapter. K-fold, however, will not work on time series: it assumes the observations are independent, an assumption that does not hold for time series data. An understanding of k-fold will help you learn how forward-chaining ...
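To make the contrast concrete, here is a sketch of forward-chaining splits in plain Python. The helper name and the toy sizes are my own, not the book's; the behavior mirrors the usual expanding-window scheme (as in scikit-learn's `TimeSeriesSplit`), where each training window grows and always ends before its test window begins.

```python
def forward_chaining_splits(n_samples, n_splits):
    """Yield (train, test) index lists for forward chaining.

    Each training window consists of all observations before its
    test window, so the model never trains on the future.
    (Hypothetical helper for illustration.)
    """
    test_size = n_samples // (n_splits + 1)
    for i in range(n_splits):
        train_end = n_samples - (n_splits - i) * test_size
        yield (list(range(train_end)),
               list(range(train_end, train_end + test_size)))

for train, test in forward_chaining_splits(6, 3):
    print("train:", train, "test:", test)
# train: [0, 1, 2] test: [3]
# train: [0, 1, 2, 3] test: [4]
# train: [0, 1, 2, 3, 4] test: [5]
```

Unlike k-fold, which shuffles observations into arbitrary folds, every split here respects the time ordering, which is why this approach works for time series.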
