8. Bayes Filter

8.1. Introduction

This chapter generalizes the Kalman filter to cases where the functions are nonlinear and the noise is non-Gaussian. The resulting observer is called the Bayes filter. Instead of computing, at each time k, the estimate of the state and its covariance, the Bayes filter directly computes the probability density function of the state vector. Like the Kalman filter, it consists of two steps: the prediction and the correction. In the linear and Gaussian case, the Bayes filter is equivalent to the Kalman filter.
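The two steps can be sketched numerically for a finite state space, where the density becomes a vector of cell probabilities. The following is a minimal histogram-filter sketch; the 5-cell grid, motion kernel and sensor likelihood are toy assumptions chosen for illustration, not taken from the chapter.

```python
import numpy as np

def predict(belief, motion_kernel):
    # Prediction step: propagate the belief through the noisy motion model,
    # where motion_kernel[i, j] = p(x_k = cell i | x_{k-1} = cell j).
    return motion_kernel @ belief

def correct(belief, likelihood):
    # Correction step: multiply by the measurement likelihood p(y_k | x_k)
    # and renormalize so the belief remains a probability distribution.
    posterior = likelihood * belief
    return posterior / posterior.sum()

belief = np.full(5, 0.2)                     # uniform prior over 5 cells
K = 0.5 * np.eye(5) + 0.5 * np.eye(5, k=-1)  # stay or move one cell right
K[4, 4] += 0.5                               # last cell absorbs (columns sum to 1)

belief = predict(belief, K)
belief = correct(belief, np.array([0.1, 0.1, 0.8, 0.1, 0.1]))  # sensor favors cell 2
```

In the linear Gaussian case these two operations reduce to the prediction and correction equations of the Kalman filter, with the belief vector replaced by a mean and a covariance.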

By increasing the level of abstraction, the Bayes filter will allow us to gain a better understanding of the Kalman filter, and some proofs become easier and more intuitive. As an illustration, we will consider the smoothing problem, where the estimation is made more accurate by taking into account all measurements, including future ones, when they are available. Of course, smoothing is mainly used for offline applications.

8.2. Basic notions of probabilities

Marginal density. If x ∈ ℝⁿ and y ∈ ℝᵐ are two random vectors with a joint probability density function π(x, y), note that π is a function which associates with each (x, y) ∈ ℝⁿ × ℝᵐ an element of ℝ⁺ denoted by π(x, y). The marginal density for x is

π(x) = ∫ℝᵐ π(x, y) · dy [8.1]
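Equation [8.1] can be checked numerically: discretize a joint density on a grid and integrate out y with a Riemann sum. The standard Gaussian joint used below is an illustrative assumption; its marginal should match the analytic N(0, 1) density.

```python
import numpy as np

dx = dy = 0.05
x = np.arange(-4, 4, dx)
y = np.arange(-4, 4, dy)
X, Y = np.meshgrid(x, y, indexing="ij")

# joint density of two independent standard normal variables
joint = np.exp(-(X**2 + Y**2) / 2) / (2 * np.pi)

# marginal for x, per [8.1]: integrate the joint density over y
marginal = joint.sum(axis=1) * dy

# compare with the analytic N(0, 1) density
analytic = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
err = np.max(np.abs(marginal - analytic))
```

The truncation of the integration domain to [-4, 4] and the finite grid step introduce only a small numerical error here, since the Gaussian tails beyond ±4 carry negligible mass.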

Note that, to be rigorous, we should have written

πₓ(x) = ∫ℝᵐ π₍ₓ,ᵧ₎(x, y) · dy

but this ...
