
80 Image Statistics
and thus
I(X, Y) = H(X) − H(X | Y). (2.115)
Mutual information measures the degree of dependence between the two
images, a value of zero indicating statistical independence. This is to be con-
trasted with correlation, where a value of zero implies statistical independence
only for normally distributed quantities; see Theorem 2.7.
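As an illustration of this histogram-based estimate, the following is a minimal NumPy sketch (not from the text); the function name `mutual_information` and the choice of 64 quantization bins are illustrative assumptions.

```python
import numpy as np

def mutual_information(img1, img2, bins=64):
    """Estimate mutual information (in nats) between two images from
    their normalized joint histogram: I = sum p12 * log(p12/(p1*p2))."""
    # joint two-dimensional histogram of the quantized gray values
    p12, _, _ = np.histogram2d(img1.ravel(), img2.ravel(), bins=bins)
    p12 /= p12.sum()          # normalize so that sum_ij p12(i,j) = 1
    p1 = p12.sum(axis=1)      # marginal histogram of img1
    p2 = p12.sum(axis=0)      # marginal histogram of img2
    nz = p12 > 0              # skip empty joint bins (0 log 0 = 0)
    return np.sum(p12[nz] * np.log(p12[nz] / np.outer(p1, p2)[nz]))

# Illustrative check: an image is highly informative about itself,
# while two independent noise images give an estimate near zero.
rng = np.random.default_rng(0)
a = rng.integers(0, 256, (128, 128)).astype(float)
b = rng.integers(0, 256, (128, 128)).astype(float)
```

Note that, because the histograms are estimated from finitely many pixels, the estimate for independent images is slightly positive rather than exactly zero.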
In practice the images are quantized, so that if p_1 and p_2 are their nor-
malized histograms (i.e., Σ_i p_1(i) = Σ_i p_2(i) = 1) and p_12 is the normalized
two-dimensional histogram, Σ_ij p_12(i, j) = 1, then the mutual information is
I(1, 2) = −Σ_ij p_12(i, j) { log[p_1(i)] + log[p_2(j)] − log[p_12(i, j)] }
        = Σ_ij p_12(i, j)