9

Variants of Least Mean-Square Algorithm

9.1    THE NORMALIZED LEAST MEAN-SQUARE ALGORITHM

Consider the conventional least mean-square (LMS) algorithm with the fixed step-size parameter µ replaced by a time-varying step size µ(n) as follows (we substitute 2µ with µ for simplicity):

\mathbf{w}(n+1) = \mathbf{w}(n) + \mu(n)\,e(n)\,\mathbf{x}(n)

(9.1)

Next, define the a posteriori error, e_p(n), as

e_p(n) = d(n) - \mathbf{w}^T(n+1)\,\mathbf{x}(n)

(9.2)

Substituting (9.1) in (9.2), and using the a priori error equation e(n) = d(n) − \mathbf{w}^T(n)\,\mathbf{x}(n), we obtain

e_p(n) = \bigl[\,1 - \mu(n)\,\mathbf{x}^T(n)\,\mathbf{x}(n)\,\bigr]\,e(n),
\qquad
\mathbf{x}^T(n)\,\mathbf{x}(n) = \lVert \mathbf{x}(n)\rVert^2 = \sum_{i=0}^{M-1} \lvert x(n-i)\rvert^2

(9.3)
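The step from (9.2) to (9.3) can be written out explicitly by substituting the update (9.1) for w(n+1) and recognizing the a priori error e(n):

```latex
\begin{aligned}
e_p(n) &= d(n) - \bigl[\mathbf{w}(n) + \mu(n)\,e(n)\,\mathbf{x}(n)\bigr]^T \mathbf{x}(n) \\
       &= \underbrace{d(n) - \mathbf{w}^T(n)\,\mathbf{x}(n)}_{e(n)}
          \;-\; \mu(n)\,e(n)\,\mathbf{x}^T(n)\,\mathbf{x}(n) \\
       &= \bigl[\,1 - \mu(n)\,\mathbf{x}^T(n)\,\mathbf{x}(n)\,\bigr]\,e(n).
\end{aligned}
```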

Minimizing e_p^2(n) with respect to µ(n) results in (see Problem 9.1.1)

\mu(n) = \frac{1}{\lVert \mathbf{x}(n)\rVert^2}

(9.4)
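The result (9.4) is the calculation asked for in Problem 9.1.1: differentiating the squared a posteriori error from (9.3) with respect to µ(n) and setting the derivative to zero,

```latex
\frac{\partial\, e_p^2(n)}{\partial \mu(n)}
  = -2\,\mathbf{x}^T(n)\,\mathbf{x}(n)\,
    \bigl[\,1 - \mu(n)\,\mathbf{x}^T(n)\,\mathbf{x}(n)\,\bigr]\,e^2(n) = 0
\quad\Longrightarrow\quad
\mu(n) = \frac{1}{\mathbf{x}^T(n)\,\mathbf{x}(n)} = \frac{1}{\lVert \mathbf{x}(n)\rVert^2},
```

a choice that drives the a posteriori error e_p(n) to zero exactly.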

Substituting (9.4) in (9.1), we find the normalized LMS (NLMS) recursion

\mathbf{w}(n+1) = \mathbf{w}(n) + \frac{1}{\lVert \mathbf{x}(n)\rVert^2}\,e(n)\,\mathbf{x}(n)
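As an illustration, here is a minimal NumPy sketch of the resulting recursion. The function name `nlms` and the small constant `eps`, which guards against division by a near-zero input norm, are our own additions and not part of the bare derivation; in practice a scaling factor `mu` in (0, 2) is also commonly applied to the normalized step size.

```python
import numpy as np

def nlms(x, d, M, mu=1.0, eps=1e-8):
    """Normalized LMS adaptive filter of length M.

    Implements w(n+1) = w(n) + mu / (eps + ||x(n)||^2) * e(n) * x(n),
    i.e. the update (9.1) with the time-varying step size (9.4).
    `mu` and `eps` are conventional safeguards added for illustration.
    """
    N = len(x)
    w = np.zeros(M)          # w(n): filter tap weights
    e = np.zeros(N)          # e(n): a priori error sequence
    for n in range(M - 1, N):
        xn = x[n - M + 1:n + 1][::-1]        # x(n) = [x(n), ..., x(n-M+1)]^T
        e[n] = d[n] - w @ xn                 # e(n) = d(n) - w^T(n) x(n)
        w = w + (mu / (eps + xn @ xn)) * e[n] * xn   # NLMS update
    return w, e
```

In the noiseless system-identification setting, each update drives the a posteriori error toward zero, so the tap weights converge rapidly to the unknown filter.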
