# 6 NONLINEAR REGRESSION

## 6.1 INTRINSIC LINEARITY/NONLINEARITY

In the classical linear regression model, the *multivariate regression hyperplane* has the form

$$Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_m X_m + \varepsilon, \tag{6.1}$$
where we have *m* predetermined explanatory variables or *regressors X*_{1}, ..., *X*_{m} and *ε* is a random error term. Here Equation 6.1 depicts a *linear regression model* since it is linear in the parameters *β*_{0}, *β*_{1}, ..., *β*_{m}. It is assumed that *n* > *p* = *m* + 1 and no exact linear relationship exists between the *X*_{j}'s, *j* = 1,..., *m*.

For fixed *X*_{j}'s, *j* = 1,..., *m*, the *population regression hyperplane* is specified as the conditional mean of *Y* given the *X*_{j}'s, or

$$E(Y \mid X_1, \ldots, X_m) = \beta_0 + \beta_1 X_1 + \cdots + \beta_m X_m, \tag{6.2}$$
given that *E*(*ε*) = 0. Given Equation 6.2,

$$\beta_0 = E(Y \mid X_1 = X_2 = \cdots = X_m = 0);$$
that is, the population intercept is the mean of *Y* given that all of the *X*_{j}'s are set equal to zero. Also,

$$\beta_j = \frac{\partial E(Y \mid X_1, \ldots, X_m)}{\partial X_j}$$
is termed the *j*th *partial regression coefficient*; that is, as *X*_{j} increases by one unit, the average value of *Y* changes by *β*_{j} units, given that all remaining explanatory variables are held constant. Once the *β*_{k}'s, *k* = 0, 1,..., *m*, are estimated from the sample information, we obtain the *sample regression hyperplane*

$$\hat{Y} = \hat{\beta}_0 + \hat{\beta}_1 X_1 + \cdots + \hat{\beta}_m X_m,$$
where *Ŷ* is the estimated value of ...
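As a minimal sketch of the estimation step above, the following simulates data from the linear model of Equation 6.1 with made-up coefficients and *m* = 2 regressors, then recovers the sample regression hyperplane by ordinary least squares. The data, coefficient values, and noise scale are all illustrative assumptions, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 100, 2                        # n observations, m regressors (p = m + 1)
X = rng.normal(size=(n, m))          # fixed regressors X_1, X_2 (hypothetical)
beta = np.array([1.0, 2.0, -0.5])    # assumed true beta_0, beta_1, beta_2
eps = rng.normal(scale=0.1, size=n)  # random error term with E(eps) = 0
Y = beta[0] + X @ beta[1:] + eps     # Equation 6.1

# Design matrix with a leading column of ones for the intercept beta_0
Z = np.column_stack([np.ones(n), X])

# Least-squares estimates beta-hat_0, ..., beta-hat_m
b_hat, *_ = np.linalg.lstsq(Z, Y, rcond=None)

# Sample regression hyperplane: Y-hat = Z @ beta-hat
Y_hat = Z @ b_hat
```

With the error variance small relative to the signal, the estimated coefficients sit close to the assumed true values, and each fitted *β̂*_{j} carries the partial-effect interpretation given above: a one-unit increase in *X*_{j}, holding the other regressor fixed, shifts *Ŷ* by *β̂*_{j}.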
