# CHAPTER 9

# MULTICOLLINEARITY

## 9.1 INTRODUCTION

The use and interpretation of a multiple regression model often depend explicitly or implicitly on the estimates of the individual regression coefficients. Some examples of inferences that are frequently made include the following:

- Identifying the relative effects of the regressor variables
- Prediction and/or estimation
- Selection of an appropriate set of variables for the model

If there is no linear relationship between the regressors, they are said to be **orthogonal**. When the regressors are orthogonal, inferences such as those illustrated above can be made relatively easily. Unfortunately, in most applications of regression, the regressors are not orthogonal. Sometimes the lack of orthogonality is not serious. However, in some situations the regressors are nearly perfectly linearly related, and in such cases the inferences based on the regression model can be misleading or erroneous. When there are **near-linear dependencies** among the regressors, the problem of **multicollinearity** is said to exist.
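The contrast between orthogonal and nearly dependent regressors can be illustrated numerically. The sketch below (with simulated data; the variable names and the noise scale are illustrative assumptions, not from the text) compares the condition number of the standardized $\mathbf{X}'\mathbf{X}$ matrix for two independent regressors against the case where one regressor is almost a linear function of another; a large condition number signals near-linear dependence.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Near-orthogonal regressors: independent random draws
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)

# Nearly collinear regressors: x3 is almost a linear function of x1
# (the noise scale 0.01 is an arbitrary choice for illustration)
x3 = 2.0 * x1 + rng.normal(scale=0.01, size=n)

def condition_number(*cols):
    """Condition number of X'X after centering and scaling
    each column to unit length (i.e., of the correlation matrix)."""
    X = np.column_stack(
        [(c - c.mean()) / np.linalg.norm(c - c.mean()) for c in cols]
    )
    return np.linalg.cond(X.T @ X)

print(condition_number(x1, x2))  # small: regressors nearly orthogonal
print(condition_number(x1, x3))  # very large: near-linear dependence
```

When the regressors are exactly orthogonal the condition number is 1; as a near-linear dependency develops, it grows without bound, which foreshadows the diagnostic measures discussed later in this chapter.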

This chapter extends the preliminary discussion of multicollinearity begun in Chapter 3 and examines a variety of related problems and techniques. Specifically, we will consider the causes of multicollinearity, its effects on inference, methods for detecting its presence, and some techniques for dealing with the problem.

## 9.2 SOURCES OF MULTICOLLINEARITY

We write the multiple regression model ...