The least-squares method used to train a linear regression model produces the best linear unbiased coefficient estimates when the Gauss-Markov assumptions are met. Variations such as GLS fare similarly well even when the OLS assumptions about the error covariance matrix are violated. However, there are estimators that deliberately accept biased coefficients in order to reduce the variance and achieve a lower generalization error overall.
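The trade-off can be seen in a small simulation, a minimal sketch rather than code from this chapter (it assumes scikit-learn's LinearRegression and Ridge and synthetic data): with two nearly collinear regressors, the unbiased OLS coefficients swing wildly from sample to sample, while the shrinkage estimator accepts a small bias in exchange for far lower variance.

```python
# Minimal sketch (illustrative, not from the book): compare the sampling
# variability of unbiased OLS coefficients with biased Ridge coefficients
# when the regressors are nearly collinear.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(42)
true_coef = np.array([1.0, 1.0])
ols_coefs, ridge_coefs = [], []

for _ in range(1000):
    x1 = rng.normal(size=50)
    x2 = x1 + rng.normal(scale=0.05, size=50)   # x2 almost duplicates x1
    X = np.column_stack([x1, x2])
    y = X @ true_coef + rng.normal(size=50)     # noisy linear target

    ols_coefs.append(LinearRegression().fit(X, y).coef_)
    ridge_coefs.append(Ridge(alpha=10.0).fit(X, y).coef_)

ols_coefs, ridge_coefs = np.array(ols_coefs), np.array(ridge_coefs)
# OLS: mean near the true [1, 1] but with a very large spread;
# Ridge: slightly biased mean, much smaller spread.
print('OLS   mean coef:', ols_coefs.mean(0), 'std:', ols_coefs.std(0))
print('Ridge mean coef:', ridge_coefs.mean(0), 'std:', ridge_coefs.std(0))
```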
When a linear regression model contains many correlated variables, their coefficients will be poorly determined, because the effect of a large positive coefficient on the residual sum of squares (RSS) can be canceled by a similarly large negative coefficient on a correlated variable. Hence, the model ...