February 2018
Intermediate to advanced
378 pages
10h 14m
English
Under the standard least squares method, the estimated regression coefficients can vary wildly when the design matrix is ill-conditioned, for example when predictors are highly correlated. We can formulate least squares regression as an optimization problem:

$$\hat{w} = \arg\min_w \; (y - Xw)^T (y - Xw)$$
What we have on the right here is just the RSS written as a scalar product. Tikhonov-regularized least squares regression (ridge regression) adds a penalty term, the squared L2 norm of the weight vector:

$$\hat{w} = \arg\min_w \; \left[ (y - Xw)^T (y - Xw) + \lambda \|w\|_2^2 \right]$$

$$\|w\|_2^2 = w^T w = \sum_j w_j^2$$

where $\|w\|_2$ is the L2 norm of $w$ and λ is a scalar shrinkage parameter that controls the strength of the penalty.
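The closed-form solution to the penalized problem is $\hat{w} = (X^T X + \lambda I)^{-1} X^T y$, and the stabilizing effect of the penalty can be seen with a few lines of NumPy. The helper name `ridge_fit` and the synthetic data below are illustrative assumptions, not part of the text:

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form Tikhonov-regularized least squares:
    w = (X^T X + lam * I)^{-1} X^T y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# Illustrative data (an assumption): two nearly collinear columns make
# plain least squares ill-conditioned, so its coefficients can blow up.
rng = np.random.default_rng(0)
x = rng.normal(size=(50, 1))
X = np.hstack([x, x + 1e-6 * rng.normal(size=(50, 1))])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=50)

w_ols = ridge_fit(X, y, lam=0.0)    # unregularized: unstable coefficients
w_ridge = ridge_fit(X, y, lam=1.0)  # penalized: shrunk and stable
```

Increasing λ shrinks the weight vector toward zero, trading a little bias for a large reduction in coefficient variance.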