Minimizing the cost function

At the core of linear regression is the search for the equation of a line that minimizes the sum of squared differences between the line's predicted y values and the observed ones. As a reminder, let's call our regression function h and its predictions h(X), as in this formulation:

h(X) = Xw

Consequently, our cost function to be minimized is as follows:

J(w) = (1 / 2n) * Σᵢ (h(Xᵢ) − yᵢ)²
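As a rough sketch of what this minimization looks like in practice, the following NumPy snippet fits h(X) = Xw by plain batch gradient descent on the squared-error cost. The data, learning rate, and iteration count here are illustrative assumptions, not values from the text:

```python
import numpy as np

# Illustrative data: one feature plus a bias column, with known weights.
rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])
true_w = np.array([2.0, 0.5])
y = X @ true_w + rng.normal(0, 0.1, n)

def cost(w, X, y):
    """Squared-error cost J(w) = (1 / 2n) * sum((Xw - y)^2)."""
    residuals = X @ w - y
    return (residuals @ residuals) / (2 * len(y))

# Batch gradient descent: repeatedly step against the gradient of J(w).
w = np.zeros(2)
learning_rate = 0.01
for _ in range(5000):
    gradient = X.T @ (X @ w - y) / len(y)
    w -= learning_rate * gradient

print(w)            # should land near true_w
print(cost(w, X, y))
```

Gradient descent is only one of the minimization methods the text alludes to; for a problem this small, a closed-form solution (the normal equations) would work just as well.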

There are quite a few methods to minimize it, some performing better than others in the presence of large quantities of data. Among the better performers, ...
