The Least-Squares Family
In Chapter 6, the least-squares (LS) cost function is reconsidered. The LS estimator is rederived via geometric arguments and its properties are discussed. The ridge regression formulation is also viewed from a geometric perspective. The singular value decomposition (SVD) matrix factorization method is presented and the concept of low-rank matrix approximation is introduced. Emphasis is placed on the recursive least-squares (RLS) algorithm for the iterative solution of the LS cost function, and its relation to Newton's optimization method is established. The coordinate descent scheme for iterative optimization is presented as an alternative to the steepest descent and Newton's approaches. Finally, the method of total least-squares is reviewed.
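As a minimal illustration of two of the topics above, the sketch below computes the ordinary LS and ridge solutions of an overdetermined linear system, and then recovers the LS solution again through the SVD route. The data, the true parameter vector, and the regularization value `lam` are synthetic assumptions chosen only for demonstration, not examples taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic overdetermined system: 50 observations, 3 unknowns (assumed data)
X = rng.standard_normal((50, 3))
theta_true = np.array([1.0, -2.0, 0.5])
y = X @ theta_true + 0.1 * rng.standard_normal(50)

# Ordinary LS estimate: minimizes ||y - X theta||^2
theta_ls = np.linalg.lstsq(X, y, rcond=None)[0]

# Ridge regression: solves (X^T X + lam I) theta = X^T y
lam = 0.1  # illustrative regularization parameter
theta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

# SVD route: with X = U diag(s) V^T, theta_ls = V diag(1/s) U^T y
U, s, Vt = np.linalg.svd(X, full_matrices=False)
theta_svd = Vt.T @ ((U.T @ y) / s)

# The two LS computations agree to numerical precision
print(np.allclose(theta_ls, theta_svd))
```

Truncating the sum over singular values to the largest few gives the low-rank approximation of `X` mentioned above, which is one reason the SVD plays a central role in the chapter.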