Akaike's information criterion (AIC) is known in the statistics trade as a penalized log-likelihood. If you have a model for which a log-likelihood value can be obtained, then

AIC = −2 × log-likelihood + 2(p + 1),
where p is the number of parameters in the model, and 1 is added for the estimated variance (you could call this another parameter if you wanted to). To demystify AIC, let's calculate it by hand. We revisit the regression data for which we calculated the log-likelihood by hand on p. 217.
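The definition above can be written directly as a small R function (aic_by_hand is a hypothetical helper name used here for illustration, not a built-in function):

```r
# AIC as a penalized log-likelihood: -2 * log-likelihood + 2 * (p + 1),
# where the extra 1 accounts for the estimated variance
aic_by_hand <- function(loglik, p) {
  -2 * loglik + 2 * (p + 1)
}

# plugging in the log-likelihood from p. 217 with p = 1
aic_by_hand(-23.98941, p = 1)
# [1] 51.97882
```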
attach(regression)
names(regression)
[1] "growth" "tannin"
growth
[1] 12 10  8 11  6  7  2  3  3
There are nine values of the response variable, growth, and we calculated the log-likelihood as −23.98941 earlier. Only one parameter was estimated from the data for these calculations (the mean value of y), so p = 1. This means that AIC should be

AIC = −2 × (−23.98941) + 2 × (1 + 1) = 51.97882.
Fortunately, we do not need to carry out these calculations by hand, because there is a built-in function for calculating AIC. It takes a model object as its argument, so we need to fit a one-parameter (intercept-only) model to the growth data like this:

model <- lm(growth ~ 1)
Then we can get the AIC directly:
AIC(model)
[1] 51.97882
The more parameters there are in the model, the better the fit. You could obtain a perfect fit if you had ...