Machine Learning by Sergios Theodoridis

Chapter 6

The Least-Squares Family

Abstract

In Chapter 6, the least-squares (LS) cost function is reconsidered. The LS estimator is rederived via geometric arguments and its properties are discussed. The ridge regression formulation is also interpreted geometrically. The singular value decomposition (SVD) matrix factorization is presented, and the concept of low-rank matrix approximation is introduced. Emphasis is given to the recursive least-squares (RLS) algorithm for the iterative solution of the LS cost function, and its relation to Newton's optimization method is established. The coordinate descent scheme for iterative optimization is introduced as an alternative to the steepest descent and Newton approaches. Finally, the total-least-squares method is reviewed.
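To make the quantities named in the abstract concrete, the following is a minimal, illustrative sketch (not taken from the book) of the closed-form LS and ridge regression estimates and of a basic RLS recursion, written with NumPy. The synthetic data, the regularization value lam, the forgetting factor lam_f, and the initialization constant delta are arbitrary choices introduced only for this example.

import numpy as np

# Illustrative sketch only; data and parameter values are arbitrary choices,
# not values from the book.
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))          # regression (input) matrix
theta_true = rng.standard_normal(p)      # unknown parameter vector
y = X @ theta_true + 0.1 * rng.standard_normal(n)

# Ordinary LS estimate via the pseudo-inverse (computed through the SVD of X).
theta_ls = np.linalg.pinv(X) @ y

# Ridge regression: add lam * I to X^T X before solving the normal equations.
lam = 1.0
theta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Basic RLS recursion: update the estimate sample by sample instead of
# re-solving the normal equations at every step.
lam_f = 1.0                              # forgetting factor (1.0 = no forgetting)
delta = 1e3
P = delta * np.eye(p)                    # running estimate of the inverse correlation matrix
theta_rls = np.zeros(p)
for x_n, y_n in zip(X, y):
    Px = P @ x_n
    k = Px / (lam_f + x_n @ Px)          # gain vector
    e = y_n - theta_rls @ x_n            # a priori error
    theta_rls = theta_rls + k * e
    P = (P - np.outer(k, Px)) / lam_f

print("||theta_ls - theta_rls|| =", np.linalg.norm(theta_ls - theta_rls))

With a large delta and lam_f equal to 1, the RLS iterate converges toward the batch LS solution, which is what the printed norm difference illustrates.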

Keywords

Least-squares ...
