Bayesian estimation has as its objective the estimation of successive values of a parameter vector x given an observation vector z. As noted above, it is customary to treat both x and z as random vectors. For the parameter vector, the stochastic assumption is inherent in the equations governing the dynamics of the parameter, where unmodeled effects are added as random noise. For the observation vector, a stochastic nature can be justified by assuming that there is always some random measurement noise. The random vector x is assumed to have a known prior density function p(x). This prior distribution summarizes all that is known about the parameter vector before any observational data become available. If the true value of x were known, then the probability density of z would be given by the conditional density, or likelihood function, p(z|x), and the complete statistical properties of z would be known.

Once an experiment has been conducted and a realization of the random vector z is available, one can use Bayes' law to obtain the posterior conditional density of x:

    p(x|z) = p(z|x) p(x) / p(z),    where    p(z) = ∫ p(z|x) p(x) dx.

Thus, within the Bayesian framework, the posterior density p(x|z) contains everything there is to know about the parameter vector x once the observation z is in hand.
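The update above can be sketched numerically. The following is a minimal illustration, not part of the original text: all numerical values (prior mean and variance, noise variance, the observed z) are assumed for the example. It evaluates the prior, likelihood, and posterior of Bayes' law on a discrete grid for a scalar parameter observed through additive Gaussian noise, where the normalizing constant p(z) becomes a simple sum.

```python
import numpy as np

# Illustrative setup (assumed values): scalar parameter x observed as
# z = x + v, with v ~ N(0, r).  Prior: x ~ N(0, 4).
x_grid = np.linspace(-5.0, 5.0, 2001)   # discretized parameter space
dx = x_grid[1] - x_grid[0]

# Prior density p(x) on the grid, normalized so it integrates to 1
prior = np.exp(-x_grid**2 / (2 * 4.0))
prior /= prior.sum() * dx

# Likelihood p(z|x) for one observed realization z, noise variance r
z = 1.5
r = 1.0
likelihood = np.exp(-(z - x_grid)**2 / (2 * r))

# Bayes' law: posterior is proportional to likelihood times prior;
# dividing by the sum approximates the normalizer p(z)
posterior = likelihood * prior
posterior /= posterior.sum() * dx

# Posterior mean; for this conjugate Gaussian-Gaussian case the exact
# value is (prior_var / (prior_var + r)) * z = (4/5) * 1.5 = 1.2
post_mean = np.sum(x_grid * posterior) * dx
print(post_mean)
```

Because both densities here are Gaussian, the grid result can be checked against the closed-form conjugate update; for non-Gaussian priors or likelihoods the same grid computation still applies, which is the appeal of working directly with Bayes' law.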
