9.1 THE NORMALIZED LEAST MEAN-SQUARE ALGORITHM
Consider the conventional least mean-square (LMS) algorithm with the fixed step-size parameter µ replaced by a time-varying step size µ(n), as follows (for simplicity, we substitute 2µ with µ):
w(n + 1) = w(n) + µ(n)e(n)x(n)   (9.1)
Next, define the a posteriori error, e_ps(n), as
e_ps(n) = d(n) − w^T(n + 1)x(n)   (9.2)
Substituting (9.1) into (9.2) and using the error equation e(n) = d(n) − w^T(n)x(n), we obtain
e_ps(n) = e(n)[1 − µ(n)x^T(n)x(n)]   (9.3)
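In detail, the algebra behind (9.3) is a sketch of the substitution step:

e_ps(n) = d(n) − [w(n) + µ(n)e(n)x(n)]^T x(n)
        = [d(n) − w^T(n)x(n)] − µ(n)e(n)x^T(n)x(n)
        = e(n)[1 − µ(n)x^T(n)x(n)].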
Minimizing e_ps(n) with respect to µ(n) results in (see Problem 9.1.1)
µ(n) = 1 / (x^T(n)x(n))   (9.4)
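As a concrete illustration, the following Python/NumPy sketch (not from the text) applies the update (9.1) with the normalized step size (9.4). The signal names x and d, the filter length M, and the small constant eps added to the denominator are assumptions for the example; eps is a common practical safeguard against division by a near-zero input norm and does not appear in (9.4).

import numpy as np

def nlms(x, d, M, eps=1e-8):
    # Sketch of the normalized LMS update of (9.1) with the step size of (9.4).
    # x : input signal (1-D array)
    # d : desired signal (1-D array, same length as x)
    # M : number of filter taps
    # eps : small constant guarding the denominator (practical addition, not in (9.4))
    N = len(x)
    w = np.zeros(M)                            # w(n): current tap-weight vector
    e = np.zeros(N)                            # a priori error e(n)
    for n in range(M, N):
        x_n = x[n - M + 1:n + 1][::-1]         # x(n): the most recent M input samples, newest first
        e[n] = d[n] - w @ x_n                  # e(n) = d(n) - w^T(n) x(n)
        mu_n = 1.0 / (x_n @ x_n + eps)         # (9.4): mu(n) = 1 / (x^T(n) x(n))
        w = w + mu_n * e[n] * x_n              # (9.1): w(n+1) = w(n) + mu(n) e(n) x(n)
    return w, e

With this choice of µ(n), the a posteriori error e_ps(n) in (9.3) is driven to zero at each step (up to the eps safeguard), which is the property the minimization above exploits.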