In this chapter, we study some linear statistical models for time series. These models are related to linear regression but account for the correlations that arise between data points in the same time series, in contrast to the standard methods applied to cross-sectional data, in which it is assumed that each data point is independent of the others in the sample.
The specific models we will discuss are:
Autoregressive (AR) models, moving average (MA) models, and autoregressive integrated moving average (ARIMA) models
Vector autoregression (VAR)
These models have traditionally been the workhorses of time series forecasting, and they continue to be applied in a wide range of situations, from academic research to industry modeling.
As a data analyst, you are probably already familiar with linear regression. If you are not, the key point for our purposes is this: standard linear regression assumes that your data points are independently and identically distributed (iid). As we have discussed at length in earlier chapters, this is not the case with time series data. In time series data, points near in time tend to be strongly correlated with one another. In fact, when there are no temporal correlations, time series data is hardly useful for traditional time series tasks, such as predicting the future or understanding temporal dynamics.
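To make this concrete, here is a minimal sketch (the AR(1) coefficient of 0.8 and the series length are illustrative choices, not from the text) contrasting the lag-1 autocorrelation of a simple autoregressive series with that of iid noise:

```python
import numpy as np

# Illustrative setup: simulate an AR(1) series, x_t = 0.8 * x_{t-1} + e_t,
# alongside iid Gaussian noise of the same length.
rng = np.random.default_rng(0)
n = 1000
noise = rng.normal(size=n)

x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + noise[t]

def lag1_autocorr(series):
    """Sample correlation between series[t-1] and series[t]."""
    return np.corrcoef(series[:-1], series[1:])[0, 1]

# The AR(1) series shows strong temporal correlation (near 0.8),
# while the iid noise shows essentially none (near 0).
print(lag1_autocorr(x))
print(lag1_autocorr(noise))
```

A regression model that treats the `x` series as iid would ignore exactly the structure that makes forecasting possible; the models in this chapter are built to exploit it.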
Sometimes time series tutorials and textbooks give an undue impression ...