The two most basic alternatives to white noise are the autoregressive and moving average models for error. These are mathematical models that: (i) produce stationary noise, (ii) are based on what many view as realistic dependence structures between nearby observations, and (iii) have simple, closed-form representations. In each case the errors are still normal (Gaussian), but there is some form of linkage between observations close in time. In this chapter the autoregressive models of order 1 and 2 [AR(1) and AR(2)] are discussed. Moving average models of order 1 and 2 [MA(1) and MA(2)] are discussed in Chapter 6. Together these ideas generalize to autoregressive models of all orders, moving average models of all orders, and autoregressive integrated moving average [ARIMA] models in Chapter 14.

Autoregression is a simple mechanism that is easy to understand and captures much of what we have in mind when we say a time series is not white noise.

The simplest idea, AR(1), is just that observations closest in time have a strong correlation (in practice the correlation is almost always positive, but the general theory is the same in either case). Consider noise around a fixed mean; if the noise has an AR(1) structure with positive correlation, then when an observation is above average, the next observation is likely to be above average as well, and vice versa. For example, ...
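This behavior can be illustrated with a short simulation. The sketch below (an illustrative example, not from the text; the parameter value 0.7 and the use of NumPy are my own choices) generates AR(1) noise around a mean of zero and checks two things: that the lag-1 sample correlation is close to the autoregressive coefficient, and that an above-average observation tends to be followed by another above-average observation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate AR(1) noise around mean zero:
#   x_t = phi * x_{t-1} + e_t,   e_t ~ N(0, sigma^2)
# A positive phi (here 0.7, chosen for illustration) produces the
# positive lag-1 correlation described in the text.
phi, sigma, n = 0.7, 1.0, 5000
x = np.empty(n)
# Draw the first value from the stationary distribution of the process,
# whose variance is sigma^2 / (1 - phi^2).
x[0] = rng.normal(0.0, sigma / np.sqrt(1 - phi**2))
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)

# Lag-1 sample autocorrelation: should be close to phi.
lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
print(f"lag-1 correlation: {lag1:.2f}")

# "Above average follows above average": the fraction of above-mean
# observations whose successor is also above the mean exceeds 1/2.
above = x > x.mean()
frac = np.mean(above[1:][above[:-1]])
print(f"fraction above-mean after above-mean: {frac:.2f}")
```

With a positive coefficient the first printed value sits near 0.7 and the second well above 0.5; setting `phi` negative reverses the pattern, which is the "either case" the general theory covers.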
