This chapter covers a variety of topics that are extensions of the basic Kalman filtering theory. Since model parameters are often poorly known, we first discuss maximum likelihood estimation (MLE) of parameters such as initial conditions, process noise variances, measurement noise variances, and dynamic model constants. System characteristics often change with time, so we next discuss methods that allow filters to adapt to model changes. The simplest adaptive filters use statistics on filter innovations to adjust the process noise covariance, and this approach can work well when changes in system “noise” levels are slow. When system states change value abruptly, a jump detection/estimation method based on hypothesis testing is usually a better approach. A multiple model filter may be appropriate when systems transition between a finite number of models.
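The innovation-based adaptation mentioned above can be sketched briefly. The idea is to compare the sample covariance of a window of recent innovations with the covariance the filter predicts for them; a ratio well above one suggests the modeled process noise is too small. This is a minimal illustrative sketch, not the chapter's specific algorithm; the window length, the scalar-ratio form, and the inflate-only rule are assumptions.

```python
import numpy as np

def innovation_adaptive_scale(innovations, H, P_pred, R):
    """Estimate a scalar inflation factor for Q from recent innovations.

    Compares the sample covariance of a window of innovations with the
    filter-predicted innovation covariance S = H P H^T + R.  A ratio
    much greater than 1 indicates the assumed process noise is too small.
    (Illustrative sketch; windowing and thresholds are assumptions.)
    """
    S_pred = H @ P_pred @ H.T + R      # predicted innovation covariance
    nu = np.asarray(innovations)       # shape (N, m): window of innovations
    S_emp = (nu.T @ nu) / len(nu)      # sample covariance (innovations are zero-mean)
    # Average empirical-to-predicted variance ratio across measurement channels
    ratio = np.trace(S_emp @ np.linalg.inv(S_pred)) / S_pred.shape[0]
    return max(ratio, 1.0)             # only inflate Q, never deflate
```

In a filter loop the returned factor would multiply the process noise covariance Q before the next time update; because the statistic is averaged over a window, this style of adaptation responds well only when "noise" levels change slowly, as the text notes.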

Methods for enforcing equality or inequality constraints on filter estimates are also of interest. The methods employed are similar to those used for least-squares estimation, but the problem is somewhat more difficult. Robust estimation is another least-squares topic applicable to filtering. H-infinity filters are designed to minimize estimation errors when input errors are larger than expected. Thus they can track more accurately than Kalman filters when anomalous conditions occur.
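For the equality-constrained case, one common approach is to project the unconstrained estimate onto the constraint surface after the measurement update. The sketch below shows the standard minimum-variance projection for a linear constraint Dx = d; it is offered as an illustration of the general idea, not as the specific method developed in the chapter, and the function and variable names are ours.

```python
import numpy as np

def project_estimate(x, P, D, d):
    """Project an unconstrained estimate onto the constraint D x = d.

    Minimum-variance projection: among all states satisfying the
    constraint, pick the one closest to x in the P^{-1}-weighted norm.
    (Illustrative sketch of one standard technique.)
    """
    S = D @ P @ D.T                    # constraint-space covariance
    K = P @ D.T @ np.linalg.inv(S)     # gain mapping constraint residual to state
    x_c = x - K @ (D @ x - d)          # corrected estimate satisfies D x_c = d
    P_c = P - K @ D @ P                # covariance reduced along constrained directions
    return x_c, P_c
```

Inequality constraints are harder because the active set is not known in advance; as the text notes, the machinery resembles constrained least squares but must be applied recursively at each filter step.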

The final two topics address alternate methods for nonlinear filtering: unscented Kalman filters and particle filters.
