18
Predictions
In this short chapter, we shall study the basic results in prediction theory. Of particular interest
are (i) the general expressions for one-step and p-step ahead predictions of the parametric
model discussed in Chapter 17 and (ii) the predictor model description for an LTI system.
18.1 INTRODUCTION
One of the central aims of modeling is prediction. To this effect, models are trained to achieve as
good predictions as possible by choosing the decision variables (model parameters) so that a chosen
norm of prediction errors (typically the squared 2-norm) is minimized. Therefore, it is important to
formally study the concept of prediction and learn the mathematics of developing predictor expres-
sions for a given model structure. This chapter is devoted to this purpose.
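As a preview of the development in this chapter, the training objective described above can be written compactly. The notation below anticipates the prediction-error framework of later chapters: ŷ[k|k−1, θ] denotes the one-step-ahead prediction of a model with parameter vector θ, and N is the number of observations.

\hat{\theta}_N = \arg\min_{\theta} \; \frac{1}{N}\sum_{k=1}^{N} \varepsilon^2[k|\theta], \qquad \varepsilon[k|\theta] = y[k] - \hat{y}[k|k-1,\theta]

where ε[k|θ] is the one-step-ahead prediction error; other norms of ε[k|θ] may replace the squared 2-norm.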
The fundamental prediction problem is stated as follows.
Given an information set, known as the data set, Z, find the best prediction of a (random)
variable Y.
The data set Z comprises measurements of all variables that are potentially informative about Y. In
time-series forecasting, for instance, Z consists of N past measurements of the series, while Y itself is the value at the present instant, y[k].
The prediction problem assumes different forms depending on the application:
i. Identification: Z consists of past observations of input-output data and y[k] is the output at the
present instant.
ii. Smoothing: Z consists of past as well as future data and y[k] is the signal at the present instant.
This is of course suited only for offline applications.
iii. Static regression: Z is the set of regressors or explanatory variables at the same instant as y[k],
which is the predicted variable.
The main challenge in prediction is usually the presence of stochasticity or uncertainty in Z and/or
Y. Deterministic signals, especially those arising from deterministic chaotic systems, also present
considerable challenges.
Prediction theory is a subject with a rich history. The concepts have their roots in the milestone
results of the early twentieth century due to the concerted efforts of several researchers (Wiener,
Kolmogorov, Wold and Cramér, to name a few). Gauss's method of predicting planetary motions us-
ing the LS approach can also be considered one of the first documented efforts at prediction. In
time-series analysis, forecasting is the technical term for prediction. A cornerstone result in prediction
theory is that the conditional expectation is the best predictor in the mean square error sense.
However, this result is of more theoretical importance than practical utility, since knowledge
of the joint p.d.f. is essential for computing conditional expectations. Moreover, E(Y|Z) can be a
complicated non-linear function of Z. Consequently, a large body of literature is built around deter-
mining the best linear predictors. In the modern era, there has been a growing theory
of non-linear predictors based on concepts from learning theory (machine, inductive and iterative
learning), Bayesian methods, etc. However, this theory is far from reaching the maturity of linear prediction
theory.
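A compact statement of the two results just mentioned is given below; it assumes finite second moments, and the covariance notation (Σ_ZZ, σ_YZ) is introduced here purely for illustration. Over all (measurable) functions f, the mean square error is minimized by the conditional expectation,

\hat{Y}^{\star} = E(Y|Z) = \arg\min_{f(\cdot)} E\big[(Y - f(Z))^2\big]

whereas, when the search is restricted to linear (affine) predictors, the minimizer is the linear MMSE predictor

\hat{Y}_{\text{lin}} = E(Y) + \sigma_{YZ}\,\Sigma_{ZZ}^{-1}\,(Z - E(Z)), \qquad \sigma_{YZ} = \text{cov}(Y,Z), \;\; \Sigma_{ZZ} = \text{cov}(Z)

which requires only the first two moments of (Y, Z) rather than the full joint p.d.f., explaining its practical appeal.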