The R Book by Michael J. Crawley


Linear Regression after Transformation

Many mathematical functions that are non-linear in their parameters can be linearized by transformation (see p. 205). The most frequent transformations (in order of frequency of use) are logarithms, antilogs and reciprocals. Here is an example of linear regression associated with a power law (p. 198): this is a two-parameter function

y = ax^b

where the parameter a describes the slope of the function for low values of x and b is the shape parameter. For b = 0 we have a horizontal relationship y = a. For b = 1 we have a straight line through the origin, y = ax, with slope a. For b > 1 the slope is positive and increases with increasing x, while for 0 < b < 1 the slope is positive but decreases with increasing x. For b < 0 (negative powers) the curve is a negative hyperbola that is asymptotic to infinity as x approaches 0 and asymptotic to zero as x approaches infinity.
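The five cases can be sketched quickly in R. This is a minimal illustration, not from the text; the value a = 2 and the particular values of b are arbitrary choices:

```r
# Evaluate y = a * x^b for each of the cases described above
a <- 2
x <- seq(0.1, 5, 0.1)
par(mfrow = c(2, 3))
for (b in c(0, 1, 2, 0.5, -1)) {
    y <- a * x^b
    plot(x, y, type = "l", main = paste("b =", b))
}
```

The second differences of y confirm the slope behaviour: positive for b > 1 (slope increasing) and negative for 0 < b < 1 (slope decreasing).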

Let's load a new dataframe and plot the data:

power<-read.table("c:\\temp\\power.txt",header=T)
attach(power)
names(power)

[1] "area" "response"

par(mfrow=c(1,2))
plot(area,response,pch=16)
abline(lm(response~area))
plot(log(area),log(response),pch=16)
abline(lm(log(response)~log(area)))

The two plots look very similar in this case (they don't always), but we need to compare the two models.

model1<-lm(response~area)
model2<-lm(log(response)~log(area))

summary(model2) ...
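Because model2 is fitted on the log-log scale, its coefficients can be back-transformed to draw the fitted power law on the original axes: log(y) = log(a) + b log(x) implies y = exp(intercept) * x^slope. A sketch, assuming power has been read and attached as above:

```r
# Back-transform the log-log fit to the original scale
xv <- seq(min(area), max(area), length = 100)
a_hat <- exp(coef(model2)[1])  # a = antilog of the intercept
b_hat <- coef(model2)[2]       # b = the slope on the log-log scale
plot(area, response, pch = 16)
lines(xv, a_hat * xv^b_hat)
```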
