Bootstrapping a Family of Non-linear Regressions

There are two broad applications of bootstrapping to the estimation of parameters in non-linear models:

  • Sample the data points at random with replacement, so that, in any given model fit, some data points are duplicated and others are left out.
  • Fit the model and estimate the residuals, then allocate the residuals at random, adding them to different fitted values in different simulations.
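The second approach (residual resampling) can be sketched in a few lines. The model, data and number of replicates below are my own toy example, not from the text:

```r
# Residual-resampling bootstrap sketch on a synthetic Michaelis-Menten
# style model (the data and model here are illustrative assumptions).
set.seed(1)
x <- 1:20
y <- 5 * x / (2 + x) + rnorm(20, sd = 0.1)
fit <- nls(y ~ a * x / (b + x), start = list(a = 4, b = 1))

boot_b <- numeric(200)
for (i in seq_len(200)) {
  # add a resampled set of residuals to the fitted values, then refit
  y_star <- fitted(fit) + sample(residuals(fit), replace = TRUE)
  fit_star <- nls(y_star ~ a * x / (b + x), start = coef(fit))
  boot_b[i] <- coef(fit_star)["b"]
}
sd(boot_b)   # bootstrap standard error of the parameter b
```

The standard deviation of the bootstrapped estimates is the bootstrap standard error of the parameter.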

Our next example involves the viscosity data from the MASS library, where sinking time is measured for three different weights in fluids of nine different viscosities:
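The data set appears to be the stormer data frame shipped with MASS (an inference from the variable names and the 23 cases used below); it can be loaded like this:

```r
# Load the Stormer viscometer data: Time taken for a weight Wt to sink
# through fluids of known Viscosity (23 cases in all).
library(MASS)
data(stormer)
str(stormer)
```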


We need to estimate the two parameters b and c and their standard errors.


Here are the results of the straightforward non-linear regression:
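A fit of this form can be obtained with nls(); the starting values here are rough guesses of mine, not taken from the text:

```r
library(MASS)
data(stormer)
# Fit Time = b*Viscosity/(Wt - c) by non-linear least squares;
# start values (b = 29, c = 2) are assumptions for illustration.
model <- nls(Time ~ b * Viscosity / (Wt - c),
             data = stormer, start = list(b = 29, c = 2))
summary(model)
```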


Formula: Time ~ b * Viscosity/(Wt - c)

  Estimate Std. Error t value Pr(>|t|)
b  29.4013     0.9155  32.114  < 2e-16 ***
c   2.2182     0.6655   3.333  0.00316 **

Residual standard error: 6.268 on 21 degrees of freedom

Here is a home-made bootstrap which leaves out cases at random. The idea is to sample the indices (subscripts) of the 23 cases at random with replacement:
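Sampling the 23 subscripts with replacement is a single call to sample():

```r
# Draw 23 case indices at random with replacement; some subscripts
# will be repeated and others will not appear at all.
ss <- sample(1:23, replace = TRUE)
ss
```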

[1]  4  4  10  10  12  3  23  22  21  13  9  14  8  5  15  14  21  14  12 3 20  14  19

In this realization cases 1, 2, 6, 7, 11, 16, 17 and 18 were left out, case 3 appeared twice, and case 14 appeared four times.
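Putting the pieces together, the whole home-made case-resampling bootstrap might look like this sketch (the stormer data, starting values, and the choice of 1000 replicates are my assumptions):

```r
library(MASS)
data(stormer)
b_boot <- numeric(1000)
c_boot <- numeric(1000)
for (i in 1:1000) {
  ss <- sample(1:23, replace = TRUE)           # resample the case indices
  boot_fit <- try(nls(Time ~ b * Viscosity / (Wt - c),
                      data = stormer[ss, ],
                      start = list(b = 29, c = 2)), silent = TRUE)
  if (inherits(boot_fit, "try-error")) next    # skip non-converging samples
  b_boot[i] <- coef(boot_fit)["b"]
  c_boot[i] <- coef(boot_fit)["c"]
}
sd(b_boot[b_boot != 0])   # bootstrap standard error of b
sd(c_boot[c_boot != 0])   # bootstrap standard error of c
```

The bootstrap standard errors can then be compared with the parametric standard errors reported by summary() above.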
