3.1 Bayesian Estimation

Bayesian estimation has as its objective the estimation of successive values of a parameter vector x given an observation vector z. As noted above, it is customary to treat both x and z as random vectors. For the parameter vector, the stochastic assumption is inherent in the equations governing the dynamics of the parameter, where unmodeled effects are added as random noise. For the observation vector, a stochastic nature is justified by assuming that there is always some random measurement noise. The random vector x is assumed to have a known prior density function p(x). This prior density summarizes all that is known about the parameter vector prior to the availability of any observational data. If the true value of x were known, the probability density of z would be given by the conditional density, or likelihood function, p(z | x), and the complete statistical properties of z would be known.

Once an experiment has been conducted and a realization of the random variable z is available, one can use Bayes' law to obtain the posterior conditional density of x:

(3.1)   p(x | z) = p(z | x) p(x) / p(z),   where p(z) = ∫ p(z | x) p(x) dx

Thus, within the Bayesian framework, the posterior density contains everything there is to know about the parameter vector x once the observation z has been taken into account: it combines the prior knowledge p(x) with the information carried by the data through the likelihood p(z | x).
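As a minimal numerical sketch of Eq. (3.1), the posterior can be evaluated on a grid for a scalar parameter with a Gaussian prior and a single noisy observation z = x + v. All concrete values here (mu0, sigma0, sigma_v, z) are illustrative assumptions, not taken from the text; for this conjugate Gaussian case the grid result can be checked against the known closed-form posterior mean.

```python
import numpy as np

# Illustrative 1-D example (names and values are assumptions):
# prior p(x) = N(mu0, sigma0^2), measurement z = x + v with v ~ N(0, sigma_v^2).
mu0, sigma0 = 0.0, 2.0   # prior mean and standard deviation
sigma_v = 1.0            # measurement-noise standard deviation
z = 1.5                  # one observed realization of z

xs = np.linspace(-10.0, 10.0, 2001)   # grid over the parameter x
dx = xs[1] - xs[0]

prior = np.exp(-0.5 * ((xs - mu0) / sigma0) ** 2)      # p(x), up to a constant
likelihood = np.exp(-0.5 * ((z - xs) / sigma_v) ** 2)  # p(z | x), up to a constant

# Bayes' law, Eq. (3.1): posterior = likelihood * prior / p(z),
# with p(z) approximated by a Riemann sum over the grid.
unnorm = likelihood * prior
posterior = unnorm / (unnorm.sum() * dx)

post_mean = (xs * posterior).sum() * dx
# Closed-form posterior mean for the conjugate Gaussian-Gaussian case:
closed_form = (sigma_v**2 * mu0 + sigma0**2 * z) / (sigma0**2 + sigma_v**2)
```

For these values the closed-form posterior mean is 1.2, and the grid estimate agrees to high accuracy; the normalizing constant p(z) plays no role in the shape of the posterior, only in making it integrate to one.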
