## Chapter 10. Linear Least Squares

The code for this chapter is in `linear.py`. For information about downloading and working with this code, see Using the Code.

## Least Squares Fit

Correlation coefficients measure the strength and sign of a
relationship, but not the slope. There are several ways to estimate the
slope; the most common is a **linear least squares
fit**. A “linear fit” is a line intended to model the
relationship between variables. A “least squares” fit is one that
minimizes the mean squared error (MSE) between the line and the
data.
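As a quick illustration (with made-up numbers, not data from the book), the MSE of a candidate line against the data is the mean of the squared vertical differences:

```python
import numpy as np

# Hypothetical data and a candidate line y = 1 + 2x (illustrative only).
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([1.2, 2.9, 5.1, 6.8])
predicted = 1.0 + 2.0 * xs

# Mean squared error between the line and the data.
mse = np.mean((ys - predicted) ** 2)
```

A least squares fit searches over candidate lines for the intercept and slope that make this quantity as small as possible.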

Suppose we have a sequence of points, `ys`, that we want to express as a function of another sequence `xs`. If there is a linear relationship between `xs` and `ys` with intercept `inter` and slope `slope`, we expect each `y[i]` to be `inter + slope * x[i]`.

But unless the correlation is perfect, this prediction is only
approximate. The vertical deviation from the line, or **residual**, is

`res = ys - (inter + slope * xs)`

The residuals might be due to random factors like measurement error, or nonrandom factors that are unknown. For example, if we are trying to predict weight as a function of height, unknown factors might include diet, exercise, and body type.
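The residuals can be computed elementwise; here is a minimal sketch assuming NumPy arrays and hypothetical values for `inter` and `slope`:

```python
import numpy as np

# Hypothetical data and candidate parameters (illustrative only).
xs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
ys = np.array([2.1, 3.9, 6.2, 8.0, 9.8])
inter, slope = 0.0, 2.0

# Vertical deviation of each point from the candidate line.
res = ys - (inter + slope * xs)
```

Each element of `res` is positive for a point above the line and negative for a point below it.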

If we get the parameters `inter` and `slope` wrong, the residuals get bigger, so it makes intuitive sense that the parameters we want are the ones that minimize the residuals.

We might try to minimize the absolute values of the residuals, or their squares, or their cubes; but the most common choice is to minimize the sum of squared residuals.
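The parameters that minimize the sum of squared residuals have a well-known closed-form solution in terms of sample moments; the following is a sketch of that solution (the function name `least_squares` is illustrative — see `linear.py` for the book's actual implementation):

```python
import numpy as np

def least_squares(xs, ys):
    """Closed-form fit minimizing the sum of squared residuals."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    meanx, meany = xs.mean(), ys.mean()
    # slope = Cov(xs, ys) / Var(xs); the intercept then places
    # the fitted line through the point of means (meanx, meany).
    slope = np.sum((xs - meanx) * (ys - meany)) / np.sum((xs - meanx) ** 2)
    inter = meany - slope * meanx
    return inter, slope

# Points that lie exactly on y = 1 + 2x recover inter = 1, slope = 2.
inter, slope = least_squares([0, 1, 2, 3], [1, 3, 5, 7])
```

For noisy data the recovered line will not pass through every point, but no other line achieves a smaller sum of squared residuals.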
