Chapter 4

Information Criteria: Examples of Applications in Signal and Image Processing

4.1. Introduction and context

In this chapter we focus on a sequence of $N$ observations $x_N = (x_1, \ldots, x_N)$ of a stationary random process, that is, a family of random variables $X = \{X_n\}_{n \in \mathbb{Z}}$ distributed according to the same unknown law $\theta$. A model $\theta_k$ with $k$ free parameters is used to represent the process $X$. Estimating $\theta_k$ optimally in the maximum likelihood (ML) sense means finding the $\hat{\theta}_k$ that maximizes $f(x_N \mid \theta_k)$, where $f$ denotes the conditional probability density of the observations $x_N$ given the model $\theta_k$. Equivalently, one may find the $\hat{\theta}_k$ that minimizes $L(\theta_k) = -\log f(x_N \mid \theta_k)$, i.e. $\hat{\theta}_k = \arg\min_{\theta_k} L(\theta_k)$; both formulations have the same effect.
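As a minimal numerical sketch of this idea (not taken from the book), the following Python snippet estimates a model in the ML sense by minimizing the negative log-likelihood $L(\theta_k)$. The model family is assumed here to be a simple i.i.d. Gaussian with $k = 2$ free parameters (mean and log-variance); any parametric family $f(x_N \mid \theta_k)$ could be substituted.

```python
# Sketch: ML estimation by minimizing L(theta) = -log f(x_N | theta).
# Assumed model family: i.i.d. Gaussian, theta = (mu, log sigma^2).
import numpy as np
from scipy.optimize import minimize

def negative_log_likelihood(theta, x):
    """L(theta) = -log f(x_N | theta) for an i.i.d. Gaussian model."""
    mu, log_var = theta
    var = np.exp(log_var)          # parameterize variance by its log to keep it positive
    n = x.size
    return 0.5 * (n * np.log(2 * np.pi * var) + np.sum((x - mu) ** 2) / var)

rng = np.random.default_rng(0)
x_N = rng.normal(loc=1.5, scale=2.0, size=500)   # synthetic observations

result = minimize(negative_log_likelihood, x0=np.array([0.0, 0.0]), args=(x_N,))
mu_hat, var_hat = result.x[0], np.exp(result.x[1])
print(f"ML estimates: mu ~ {mu_hat:.3f}, sigma^2 ~ {var_hat:.3f}")
```

Maximizing $f(x_N \mid \theta_k)$ and minimizing its negative logarithm yield the same estimator; the logarithmic form is used simply because sums of log-densities are easier to handle numerically than products of densities.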

Even though this estimation criterion is expressed for a fixed number $k$ of parameters, it might be tempting to use it to carry out a simultaneous estimation of the model's parameters and of their number, which can be written formally as follows (see the sketch after this formula): ...
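To illustrate why such a joint minimization over $k$ and $\theta_k$ is problematic, the following sketch (an illustration under assumed conditions, not an example from the book) fits polynomial models of increasing order to a noisy signal. Under a Gaussian noise assumption, least squares coincides with ML, and the minimized negative log-likelihood can only decrease as $k$ grows, so the unpenalized criterion always favors the most complex model; this is precisely the behaviour that information criteria correct by penalizing $k$.

```python
# Hedged illustration (assumed family: polynomial regression with Gaussian noise):
# the minimized negative log-likelihood decreases monotonically with k, so joint
# minimization over (k, theta_k) degenerates toward the largest k tried.
import numpy as np

rng = np.random.default_rng(1)
N = 50
t = np.linspace(0, 1, N)
x_N = np.sin(2 * np.pi * t) + 0.3 * rng.normal(size=N)   # synthetic signal

for k in range(1, 8):                          # k free polynomial coefficients
    coeffs = np.polyfit(t, x_N, deg=k - 1)     # least squares = ML under Gaussian noise
    residuals = x_N - np.polyval(coeffs, t)
    sigma2_hat = np.mean(residuals ** 2)       # ML variance estimate
    nll = 0.5 * N * (np.log(2 * np.pi * sigma2_hat) + 1)   # minimized -log f(x_N | theta_k)
    print(f"k = {k}: min L(theta_k) = {nll:.2f}")
```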
