14.4 Cramér–Rao Lower Bound

We can now ask: what is the best (smallest) achievable RMS error for a given scenario (a set of dynamic and observation models and state trajectories)? Since observation noise is always present, the best achievable RMS error occurs when the dynamic model is used without any dynamic noise.

Why are lower bounds useful? Assessing the achievable performance of an estimation algorithm (tracker) can be difficult, and a lower bound gives an indication of that performance. In many cases, lower bounds on performance are needed to determine whether a set of imposed system performance requirements is realistic.

Much of the material in this section is based on Refs [1–3] and the references contained therein. We stress here the Cramér–Rao lower bound, but [2] goes beyond this bound to consider a wide variety of bounds applicable to problems other than the Bayesian estimation methods addressed in this book.

14.4.1 The One-Dimensional Case

Consider the one-dimensional case where the process (dynamic) noise is zero, making x deterministic. Then the only contributor to errors in the estimate $\hat{x}$ will be the observation noise, which is captured by the likelihood function. For a normalized likelihood function, we know that

$$\int_{-\infty}^{\infty} p(z \mid x)\, dz = 1 \qquad (14.44)$$

It immediately follows that ...
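As a concrete numerical illustration (not from the text), consider the simplest one-dimensional case of this kind: a deterministic scalar x observed through N independent Gaussian measurements, z_i = x + v_i with v_i ~ N(0, σ²). For this model, the Fisher information is N/σ², so the Cramér–Rao lower bound on the variance of any unbiased estimator is σ²/N, and the sample mean attains it. The sketch below (all parameter values are arbitrary choices for the demonstration) verifies this by Monte Carlo:

```python
import numpy as np

# Hypothetical illustration: estimate a deterministic scalar x from N
# observations z_i = x + v_i, v_i ~ N(0, sigma^2). The Fisher information
# is N / sigma^2, so the CRLB on estimator variance is sigma^2 / N.
rng = np.random.default_rng(0)

x_true = 2.0      # deterministic state (no process noise)
sigma = 0.5       # observation-noise standard deviation
N = 25            # observations per trial
trials = 20000    # Monte Carlo trials

crlb = sigma**2 / N

# Sample-mean estimator applied to each Monte Carlo trial
z = x_true + sigma * rng.standard_normal((trials, N))
x_hat = z.mean(axis=1)

emp_var = x_hat.var()
print(f"CRLB          : {crlb:.6f}")
print(f"empirical var : {emp_var:.6f}")  # close to the CRLB
```

The empirical variance of the sample-mean estimator matches σ²/N closely, showing the bound is tight for this model; for estimators or models where the bound is not attained, the same Monte Carlo comparison shows the gap.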
