In this paper we have three main results. (a) We show that all jackknife schemes are special cases of the generalised bootstrap. (b) We introduce a new generalised bootstrap technique, called DBS, to estimate the mean-squared error of the least-squares estimate in linear models where the number of parameters tends to infinity with the number of data points, and the error terms are uncorrelated with possibly different variances. Properties of this new resampling scheme are comparable to those of UBS, introduced by Chatterjee (1997, Tech. Report No. 2/97, Calcutta). (c) We show that delete-d jackknife schemes belong to DBS or UBS depending on the limit of n⁻¹d. We also study the second-order properties of jackknife variance estimates of the least-squares parameter estimate in regression.
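To make the delete-d jackknife of part (c) concrete, here is a minimal sketch of a delete-d jackknife variance estimate for the least-squares coefficients. The function name, the exhaustive enumeration of subsets, and the scaling factor (n − d)/(d·N) are standard textbook choices, not taken from the paper itself; the scheme is only practical for small n and d.

```python
import numpy as np
from itertools import combinations

def delete_d_jackknife_var(X, y, d):
    """Delete-d jackknife variance estimate of the OLS coefficients.

    A minimal illustrative sketch: enumerates all (n choose d)
    delete-d subsets, refits least squares on each, and applies the
    standard (n - d) / (d * N) scaling to squared deviations from
    the full-sample fit.
    """
    n, p = X.shape
    beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
    subsets = list(combinations(range(n), d))
    betas = []
    for s in subsets:
        keep = np.setdiff1d(np.arange(n), np.array(s))
        b, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        betas.append(b)
    betas = np.array(betas)
    N = len(subsets)
    # Componentwise jackknife variance estimate for each coefficient
    return (n - d) / (d * N) * ((betas - beta_full) ** 2).sum(axis=0)

# Usage: a small simulated regression with heteroscedastic-looking noise
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(8), rng.normal(size=8)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=8)
var_hat = delete_d_jackknife_var(X, y, d=2)
```

Varying d relative to n changes the asymptotic behaviour of this estimator, which is the regime distinction (the limit of n⁻¹d) that the paper uses to classify delete-d schemes into DBS or UBS.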
Original language: English (US)
Number of pages: 13
Journal: Statistics and Probability Letters
State: Published - Nov 15 1998
- Least squares
- Many parameter regression
- Second-order efficiency