8

WHEN THE BOOTSTRAP IS INCONSISTENT AND HOW TO REMEDY IT

For a very wide variety of problems there are natural ways to bootstrap, and most often the methods can be verified by simulation, by asymptotic theory, or both. However, there are times when the proposed bootstrap approach fails to be consistent. In these cases it is not obvious why the bootstrap fails, and the failure can be difficult to diagnose. Experience has shown that a modification to the bootstrap can often remedy the problem.

An example is the 632 estimator of error rate in discriminant analysis: a case where the bootstrap works, but a less obvious modification works better. There are also situations where the “naïve” bootstrap is inconsistent; examples include the estimation of extreme values, the estimation of a mean when the variance is infinite, and a bootstrap approach to individual bioequivalence. In each of these cases there is a modification that makes the bootstrap consistent. A perhaps surprising result is that the m-out-of-n bootstrap, a very simple and general modification, serves as a remedy in several different cases. Although such problems are hard to diagnose, the jackknife-after-bootstrap is one technique that can help, and we discuss it in Section 8.7.
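To make the m-out-of-n idea concrete, the following is a minimal R sketch (an illustration, not code from this chapter): resample m observations with replacement from the original n, recompute the statistic, and repeat. The choice of statistic (the sample maximum, one of the problem cases above), the simulated data, and the rate m = n^(2/3) are assumptions made only for this example.

## m-out-of-n bootstrap: resample m <= n values with replacement and
## recompute the statistic B times (illustrative sketch).
m_out_of_n_boot <- function(x, statistic, m, B = 2000) {
  stopifnot(m <= length(x))
  replicate(B, statistic(sample(x, size = m, replace = TRUE)))
}

set.seed(123)
x <- rexp(100)                  # illustrative sample, n = 100
m <- floor(length(x)^(2/3))     # a common rate: m grows, but m/n -> 0

naive  <- m_out_of_n_boot(x, max, m = length(x))  # ordinary n-out-of-n bootstrap
smallm <- m_out_of_n_boot(x, max, m = m)          # m-out-of-n version
quantile(naive,  c(0.05, 0.95))
quantile(smallm, c(0.05, 0.95))

Under the usual requirement that m tends to infinity while m/n tends to zero, resampling distributions of this form are consistent in several of the settings listed above where the ordinary bootstrap is not.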

8.1 TOO SMALL OF A SAMPLE SIZE

For the nonparametric bootstrap, the bootstrap samples are drawn from a discrete set, namely the original n values. Some exact and approximate results about the bootstrap distribution can ...
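As a small numerical aside (not from the text), the discreteness can be quantified: when the n original values are distinct, a bootstrap sample is a multiset of size n drawn from them, so there are at most choose(2n - 1, n) distinct bootstrap samples. The R lines below simply tabulate this count for a few small values of n.

## Maximum number of distinct bootstrap samples of size n drawn from
## n distinct observations: choose(2n - 1, n). For small n the bootstrap
## distribution is supported on very few atoms.
n <- 2:10
data.frame(n = n, distinct_samples = choose(2 * n - 1, n))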
