In this chapter and the one to follow, we introduce methods to express joint behavior of bivariate data. It is assumed that, at least to some extent, the behavior of one variable is the result of a functional relationship between the two variables. In this chapter, we introduce the linear regression model including its ordinary least squares estimation, and the goodness-of-fit measure for a regression.

Regression analysis is important for understanding the extent to which, for example, a security price is driven by some more global factor. Throughout this chapter, we will consider only quantitative data, since most of the theory presented does not apply to other data levels.

Before turning to the theory of regression, we note the basic idea behind it: the essential relationship between the variables is expressed by a measure of scaled linear dependence, that is, the correlation.

In many applications, the joint behavior of two entities, say *x* and *y*, is of interest, so we need to analyze their joint distribution. In particular, we are interested in whether *x* and *y* are linearly related. The appropriate tool is the covariance of *x* and *y*; more precisely, we are interested in their correlation, expressed by the correlation coefficient explained in Chapter 5. Generally, we know that the correlation assumes values between −1 and 1, where the sign indicates the direction of the linear dependence. So, ...
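As a minimal sketch of this idea, the covariance and correlation coefficient of two samples can be estimated as follows; the data here are hypothetical and only serve to illustrate the computation, not any result from the text:

```python
import numpy as np

# Hypothetical bivariate data, e.g., returns of a security (y)
# and of a market index (x) over eight periods.
x = np.array([0.5, 1.2, -0.3, 0.8, 1.5, -0.7, 0.2, 1.0])
y = np.array([0.4, 1.0, -0.1, 0.9, 1.3, -0.5, 0.3, 0.8])

# Sample covariance of x and y: the off-diagonal entry of the
# 2x2 sample covariance matrix.
cov_xy = np.cov(x, y)[0, 1]

# Correlation coefficient: the covariance scaled by both sample
# standard deviations, which forces it into the interval [-1, 1].
corr_xy = cov_xy / (np.std(x, ddof=1) * np.std(y, ddof=1))

print(cov_xy, corr_xy)
```

The scaling step is what makes the correlation comparable across data sets: unlike the covariance, it does not depend on the units of *x* and *y*.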
