Chapter 15. Multiple Regression

I don’t look at a problem and put variables in there that don’t affect it.

Bill Parcells

Although the VP is pretty impressed with your predictive model, she thinks you can do better. To that end, you’ve collected additional data: you know how many hours each of your users works each day, and whether they have a PhD. You’d like to use this additional data to improve your model.

Accordingly, you hypothesize a linear model with more independent variables:

minutes = α + β₁ friends + β₂ work hours + β₃ phd + ε

Obviously, whether a user has a PhD is not a number—but, as we mentioned in Chapter 11, we can introduce a dummy variable that equals 1 for users with PhDs and 0 for users without, after which it’s just as numeric as the other variables.
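For example, here is a minimal sketch of that encoding (the raw record and its field names are hypothetical, just for illustration):

# a hypothetical raw user record
user = {"friends": 49, "work_hours": 4, "has_phd": False}

# encode the boolean as a 0/1 dummy variable so it can sit
# alongside the numeric features in the regression
phd = 1 if user["has_phd"] else 0    # 0 for this user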

The Model

Recall that in Chapter 14 we fit a model of the form:

yᵢ = α + βxᵢ + εᵢ

Now imagine that each input xᵢ is not a single number but rather a vector of k numbers, xᵢ₁, ..., xᵢₖ. The multiple regression model assumes that:

yᵢ = α + β₁xᵢ₁ + ... + βₖxᵢₖ + εᵢ

In multiple regression the vector of parameters is usually called β. We’ll want this to include the constant term as well, which we can achieve by adding a column of 1s to our data:

beta = [alpha, beta_1, ..., beta_k]

and:

x_i = [1, x_i1, ..., x_ik]

Then our model is just:

from scratch.linear_algebra import dot, Vector

def predict(x: Vector, beta: Vector) -> float:
    """assumes that the first element of x is 1"""
    return dot(x, beta)
