# Least Squares Estimation of Parameters of Linear Models

## 2.1. Introduction

The purpose of this chapter is to present different methods for obtaining least squares estimates of the parameters of linear models. To illustrate these approaches, we focus on the parameters of the autoregressive (AR) model introduced in Chapter 1.

First, we consider the simple case where the observations are not disturbed by measurement noise. We present non-recursive techniques, in which the available samples are processed as blocks of data. This leads to the Yule-Walker equations, which can be solved recursively using the Durbin-Levinson algorithm [10]. We then take up the recursive least squares (RLS) algorithm, treating in turn the cases where the autoregressive process is stationary and non-stationary.
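As a minimal sketch of the block-processing approach, the following Python example (the specific AR(2) coefficients and sample size are illustrative choices, not taken from the text) forms the Yule-Walker equations from sample autocorrelations and solves them both directly and with the Durbin-Levinson recursion:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary AR(2) process: x[n] = a1*x[n-1] + a2*x[n-2] + e[n]
a1, a2 = 0.6, -0.3          # true AR parameters (hypothetical values)
N = 20000
x = np.zeros(N)
e = rng.standard_normal(N)
for n in range(2, N):
    x[n] = a1 * x[n-1] + a2 * x[n-2] + e[n]

# Biased sample autocorrelations r[0], r[1], r[2]
r = np.array([x[:N-k] @ x[k:] / N for k in range(3)])

# Yule-Walker equations: Toeplitz system R a = [r1, r2]^T
R = np.array([[r[0], r[1]],
              [r[1], r[0]]])
a_direct = np.linalg.solve(R, r[1:])

def durbin_levinson(r, order):
    """Solve the Yule-Walker equations order by order."""
    a = np.zeros(order + 1)
    a[0] = 1.0              # monic prediction-error filter A(z)
    E = r[0]                # prediction-error power
    for m in range(1, order + 1):
        # Reflection coefficient: k = -(sum_i a[i] * r[m-i]) / E
        k = -(r[1:m+1][::-1] @ a[:m]) / E
        a_prev = a.copy()
        for i in range(1, m):
            a[i] = a_prev[i] + k * a_prev[m - i]
        a[m] = k
        E *= (1.0 - k * k)
    return -a[1:], E        # AR coefficients and residual power

a_ld, E2 = durbin_levinson(r, 2)
print(a_direct, a_ld)       # both close to [0.6, -0.3]
```

Both routes solve the same Toeplitz system; the recursion merely exploits its structure, reducing the cost from O(p^3) to O(p^2) for an order-p model.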

The second part of this chapter deals with cases where the observations are perturbed by additive white noise. Here, we first analyze the effect of the measurement noise on the estimation of the AR parameters, and then present non-recursive and recursive methods that yield unbiased estimates of the AR parameters.
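The bias induced by additive measurement noise can be seen in a small numerical sketch (the AR(1) coefficient and noise level below are illustrative assumptions): the noise inflates the lag-0 autocorrelation of the observations but leaves the lag-1 value unchanged, so the Yule-Walker estimate shrinks toward zero.

```python
import numpy as np

rng = np.random.default_rng(1)

# AR(1) process x[n] = a*x[n-1] + e[n], observed as y[n] = x[n] + w[n]
a = 0.9                     # true AR parameter (hypothetical value)
N = 100000
x = np.zeros(N)
e = rng.standard_normal(N)
for n in range(1, N):
    x[n] = a * x[n-1] + e[n]
w = rng.standard_normal(N)  # additive white measurement noise, variance 1
y = x + w

def ar1_yule_walker(z):
    """Lag-1 Yule-Walker estimate: a_hat = r(1) / r(0)."""
    r0 = z @ z / len(z)
    r1 = z[:-1] @ z[1:] / len(z)
    return r1 / r0

a_from_x = ar1_yule_walker(x)   # near 0.9: noise-free observations
a_from_y = ar1_yule_walker(y)   # biased toward zero by the noise
print(a_from_x, a_from_y)
```

Since only r(0) picks up the noise variance, the expected estimate from y is r_x(1) / (r_x(0) + sigma_w^2), which is strictly smaller in magnitude than the true parameter; the methods in this part of the chapter correct for exactly this effect.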

## 2.2. Least squares estimation of AR parameters

The least squares method is the starting point of various methods for the identification and estimation of parameters. It was introduced by Gauss in 1809, but is sometimes attributed to Legendre, who used it to predict the motion of planets from telescope measurements [26]. ...
