6
BOOTSTRAP VARIANTS
In previous chapters, we have introduced some modifications to the nonparametric bootstrap. These modifications were sometimes found to provide improvements, especially when the sample size is small. For example, in the error rate estimation problem for linear discriminant functions, the 632 estimator and the double bootstrap were introduced, among others.
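As a reminder of the idea behind the 632 estimator, here is a minimal sketch in Python (rather than the book's R). It weights the apparent (resubstitution) error at 0.368 and the average bootstrap error on out-of-bag observations at 0.632. The simple nearest-centroid classifier used for illustration is a hypothetical stand-in for a linear discriminant function, not the book's example.

```python
import numpy as np

def err632(X, y, fit, predict, B=50, rng=None):
    """Sketch of the 632 error estimator:
    0.368 * apparent error + 0.632 * mean out-of-bag bootstrap error."""
    rng = rng or np.random.default_rng(0)
    n = len(y)
    # Apparent error: train and test on the full sample
    app_err = np.mean(predict(fit(X, y), X) != y)
    oob_errs = []
    for _ in range(B):
        idx = rng.integers(0, n, size=n)            # bootstrap sample indices
        oob = np.setdiff1d(np.arange(n), idx)       # observations left out
        if len(oob) == 0:
            continue
        m = fit(X[idx], y[idx])
        oob_errs.append(np.mean(predict(m, X[oob]) != y[oob]))
    e0 = np.mean(oob_errs)                          # out-of-bag error estimate
    return 0.368 * app_err + 0.632 * e0

# Illustrative nearest-centroid classifier on one-dimensional data
fit = lambda X, y: (X[y == 0].mean(), X[y == 1].mean())
predict = lambda m, X: (np.abs(X - m[1]) < np.abs(X - m[0])).astype(int)

rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0, 1, 40), rng.normal(2, 1, 40)])
y = np.concatenate([np.zeros(40, dtype=int), np.ones(40, dtype=int)])
e = err632(X, y, fit, predict, rng=rng)
```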
In the case of confidence intervals, Hall has shown that the accuracy of both types of percentile method bootstrap confidence intervals can be improved through bootstrap iteration. If we expect that the data come from an absolutely continuous distribution, kernel methods may be used to smooth the empirical distribution function, with the bootstrap samples then drawn from the smoothed distribution. Naturally, such an approach is called a smoothed bootstrap.
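With a Gaussian kernel, drawing from the smoothed distribution amounts to resampling the data with replacement and adding kernel noise. A minimal sketch in Python (rather than the book's R); the use of Silverman's rule-of-thumb bandwidth and the median as the statistic are illustrative assumptions:

```python
import numpy as np

def smoothed_bootstrap_sample(data, h, rng):
    """One smoothed bootstrap sample: resample with replacement,
    then add Gaussian kernel noise with bandwidth h."""
    n = len(data)
    idx = rng.integers(0, n, size=n)
    return data[idx] + rng.normal(0.0, h, size=n)

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 50)

# Silverman's rule-of-thumb bandwidth (an assumption for illustration)
h = 1.06 * data.std(ddof=1) * len(data) ** (-1 / 5)

# Smoothed bootstrap distribution of the sample median
boot_medians = np.array(
    [np.median(smoothed_bootstrap_sample(data, h, rng)) for _ in range(1000)]
)
se = boot_medians.std(ddof=1)   # bootstrap standard error of the median
```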
Even though it is desirable to smooth the empirical distribution in small samples, there is a catch because kernel methods need large samples to accurately estimate the tails of the distribution. There is also a variance/bias trade-off associated with the degree of smoothing. In many applications of kernel methods, cross-validation is used to determine what the smoothing parameter should be. So, in general, it is not clear when smoothing will actually help. Silverman and Young (1987) gave a great deal of attention to this issue.
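One common cross-validation criterion for choosing the smoothing parameter is leave-one-out likelihood: pick the bandwidth that maximizes the density of each point under the kernel estimate built from the remaining points. A sketch (the Gaussian kernel and the search grid are illustrative assumptions):

```python
import numpy as np

def loo_log_likelihood(data, h):
    """Leave-one-out log-likelihood of a Gaussian kernel density
    estimate with bandwidth h."""
    n = len(data)
    d = data[:, None] - data[None, :]
    K = np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2 * np.pi))
    np.fill_diagonal(K, 0.0)            # exclude each point from its own estimate
    dens = K.sum(axis=1) / (n - 1)      # leave-one-out density at each point
    return np.sum(np.log(dens))

rng = np.random.default_rng(1)
x = rng.normal(size=100)
grid = np.linspace(0.05, 1.5, 30)       # candidate bandwidths (assumed range)
h_cv = grid[np.argmax([loo_log_likelihood(x, h) for h in grid])]
```

The chosen `h_cv` would then serve as the bandwidth for the smoothed bootstrap, though as the text notes, it is not clear in general that the resulting smoothing helps.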
Sometimes in density estimation or spectral density estimation, the bootstrap itself may be looked at as a technique to decide on ...