In this chapter we discuss an extension of the approximate nonlinear Bayesian suite of processors that takes a distinctly different approach to the nonlinear Gaussian problem. Instead of attempting to improve on the linearized approximation in the nonlinear XBP (EKF) schemes discussed in the previous section, or increasing the order of the Taylor series approximations [111], a modern statistical (linearization) transformation approach is developed. It is founded on the idea that "it is easier to approximate a probability distribution than to approximate an arbitrary nonlinear function or transformation" [3, 1221]. The classical nonlinear Bayesian processors discussed so far are based on linearizing nonlinear functions of the state and measurements to provide estimates of the underlying statistics (using Jacobians), while the statistical transformation approach is based on selecting a set of sample points that capture certain properties of the underlying distribution. This transformation is essentially a "statistical linearization" technique that incorporates the uncertainty of the prior random variable into the linearization itself [12]. The chosen sample points are then nonlinearly transformed, or propagated, to a new space, and the statistics of the transformed samples are calculated to provide the required estimates. Note that this method differs from the sampling-resampling approach, in which random samples are drawn from the prior distribution ...
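To make the idea concrete, the following sketch implements one common instance of this statistical transformation: a scaled sigma-point (unscented) transform. It deterministically selects 2n + 1 sample points that match the mean and covariance of an n-dimensional Gaussian, propagates them through a nonlinear function, and recombines them with weights to estimate the posterior statistics. The scaling parameters `alpha`, `beta`, and `kappa` and the function name are illustrative choices, not notation from the text.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=0.1, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f
    using deterministically chosen sigma points (a sketch of the
    statistical-linearization idea, not the book's exact formulation)."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n          # scaling parameter
    # Sigma points: the mean plus symmetric offsets along a matrix square root
    S = np.linalg.cholesky((n + lam) * cov)
    sigma = np.vstack([mean, mean + S.T, mean - S.T])   # shape (2n+1, n)
    # Weights for the mean and covariance estimates
    Wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    # Propagate each sigma point through the nonlinearity
    Y = np.array([f(x) for x in sigma])
    # Recombine: weighted sample mean and covariance in the new space
    y_mean = Wm @ Y
    d = Y - y_mean
    y_cov = (Wc * d.T) @ d
    return y_mean, y_cov
```

For a linear transformation the sigma-point statistics are exact, which provides a quick sanity check; for a genuinely nonlinear `f` the transform captures the propagated mean and covariance to second order without computing any Jacobians.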
