This chapter considers the **simple linear regression model**, that is, a model with a single regressor *x* whose relationship with a response *y* is a straight line. This simple linear regression model is

*y* = *β*_{0} + *β*_{1}*x* + *ε*
where the intercept *β*_{0} and the slope *β*_{1} are unknown constants and *ε* is a random error component. The errors are assumed to have mean zero and unknown variance *σ*^{2}. Additionally we usually assume that the errors are uncorrelated. This means that the value of one error does not depend on the value of any other error.

It is convenient to view the regressor *x* as controlled by the data analyst and measured with negligible error, while the response *y* is a random variable. That is, there is a probability distribution for *y* at each possible value for *x*. The mean of this distribution is

E(*y* | *x*) = *β*_{0} + *β*_{1}*x*

and the variance is

Var(*y* | *x*) = Var(*β*_{0} + *β*_{1}*x* + *ε*) = *σ*^{2}
Thus, the mean of *y* is a linear function of *x* although the variance of *y* does not depend on the value of *x*. Furthermore, because the errors are uncorrelated, the responses are also uncorrelated.
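These properties can be seen in a small simulation. The sketch below, in Python with NumPy, generates data from the model with illustrative parameter values (*β*_{0} = 2, *β*_{1} = 0.5, *σ* = 1, chosen here for demonstration and not taken from the text) and recovers the coefficients by least squares; the estimates land close to the true values because the errors have mean zero and constant variance.

```python
import numpy as np

# Illustrative (assumed) parameter values: beta0 = 2, beta1 = 0.5, sigma = 1.
rng = np.random.default_rng(0)
beta0, beta1, sigma = 2.0, 0.5, 1.0

x = np.linspace(0.0, 10.0, 200)        # regressor: controlled, measured without error
eps = rng.normal(0.0, sigma, x.size)   # uncorrelated errors with mean 0, variance sigma^2
y = beta0 + beta1 * x + eps            # the simple linear regression model

# Least-squares estimates of the regression coefficients
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
print(b0, b1)  # estimates should lie close to the true values (2, 0.5)
```

Because each *y* is the linear mean *β*_{0} + *β*_{1}*x* plus an error of constant variance, rerunning the simulation with different seeds scatters the estimates around (2, 0.5) rather than drifting with *x*.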

The parameters *β*_{0} and *β*_{1} are usually called **regression coefficients.** These coefficients have a simple and often useful interpretation. ...
