Chapter 6

The Least-Squares Family

Abstract

In Chapter 6, the least-squares (LS) cost function is revisited. The LS estimator is rederived via geometric arguments and its properties are discussed. The ridge regression formulation is also interpreted geometrically. The singular value decomposition (SVD) matrix factorization is presented and the concept of low-rank matrix approximation is introduced. Emphasis is placed on the recursive least-squares (RLS) algorithm for the iterative minimization of the LS cost function, and its relation to Newton's optimization method is established. The coordinate descent scheme for iterative optimization is introduced as an alternative to the steepest descent and Newton's approaches. Finally, the method of total least-squares is reviewed.
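As a minimal sketch of the methods the abstract names, the following NumPy snippet computes the LS estimate three equivalent ways: via the normal equations, with a ridge (l2) penalty, and through the SVD. The synthetic data, the regularization value, and all variable names are illustrative assumptions, not material from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear model y = X @ theta + noise (illustrative setup only).
X = rng.standard_normal((100, 5))
theta_true = np.arange(1.0, 6.0)
y = X @ theta_true + 0.1 * rng.standard_normal(100)

# Ordinary LS via the normal equations: theta = (X^T X)^{-1} X^T y.
theta_ls = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge regression adds an l2 penalty: theta = (X^T X + lam*I)^{-1} X^T y.
lam = 0.5  # arbitrary regularization strength, for illustration
theta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)

# SVD route: X = U S V^T gives theta = V S^{-1} U^T y (full column rank).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
theta_svd = Vt.T @ ((U.T @ y) / s)

# The normal-equation and SVD solutions coincide.
print(np.allclose(theta_ls, theta_svd))
```

The SVD route is numerically preferable when `X.T @ X` is ill-conditioned, which is also where the ridge penalty helps by lifting the small singular values.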

Keywords

Least-squares ...
