Another look at the jackknife: Further examples of generalized bootstrap

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

In this paper we have three main results. (a) We show that all jackknife schemes are special cases of generalised bootstrap. (b) We introduce a new generalised bootstrap technique called DBS to estimate the mean-squared error of the least-squares estimate in linear models where the number of parameters tends to infinity with the number of data points, and the error terms are uncorrelated with possibly different variances. Properties of this new resampling scheme are comparable to those of UBS introduced by Chatterjee (1997, Tech. Report No. 2/97, Calcutta). (c) We show that delete-d jackknife schemes belong to DBS or UBS depending on the limit of n⁻¹d. We also study the second-order properties of jackknife variance estimates of the least-squares parameter estimate in regression.
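To illustrate the kind of estimator discussed in the abstract, below is a minimal Python/NumPy sketch of the classical delete-d jackknife variance estimate for least-squares regression coefficients. It is not an implementation of the paper's DBS or UBS schemes; the function names, the Monte Carlo subset sampling, and the toy heteroscedastic data are illustrative assumptions.

import numpy as np
from itertools import combinations

def ols(X, y):
    # Ordinary least-squares coefficient estimate.
    return np.linalg.lstsq(X, y, rcond=None)[0]

def delete_d_jackknife_var(X, y, d=1, n_subsets=None, seed=None):
    # Delete-d jackknife variance estimate of the OLS coefficients:
    #   v = (n - d) / (d * N) * sum_S (b_S - b_bar)^2,
    # where b_S is the estimate with the d rows in subset S deleted and the
    # sum runs over N deletion subsets (all of them, or a random sample).
    n = len(y)
    rng = np.random.default_rng(seed)
    if n_subsets is None:
        subsets = list(combinations(range(n), d))        # all C(n, d) subsets
    else:
        subsets = [rng.choice(n, size=d, replace=False)  # Monte Carlo sample
                   for _ in range(n_subsets)]
    estimates = []
    for S in subsets:
        keep = np.ones(n, dtype=bool)
        keep[list(S)] = False
        estimates.append(ols(X[keep], y[keep]))
    estimates = np.asarray(estimates)
    centred = estimates - estimates.mean(axis=0)
    return (n - d) / (d * len(subsets)) * (centred ** 2).sum(axis=0)

# Toy usage with uncorrelated, heteroscedastic errors, echoing the setting in
# the abstract.  With d = 1 the formula reduces to the ordinary jackknife.
rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + (0.5 + np.abs(X[:, 1])) * rng.standard_normal(n)
print(delete_d_jackknife_var(X, y, d=1))
print(delete_d_jackknife_var(X, y, d=10, n_subsets=500, seed=1))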

Original language: English (US)
Pages (from-to): 307-319
Number of pages: 13
Journal: Statistics and Probability Letters
Volume: 40
Issue number: 4
State: Published - Nov 15 1998

Keywords

  • Bootstrap
  • Jackknife
  • Least squares
  • Many parameter regression
  • Second-order efficiency

Cite this

Another look at the jackknife: Further examples of generalized bootstrap. / Chatterjee, Snigdhansu.

In: Statistics and Probability Letters, Vol. 40, No. 4, 15.11.1998, p. 307-319.

Research output: Contribution to journal › Article

@article{a69660f60e8b4f28b0761ff14ab4ef30,
title = "Another look at the jackknife: Further examples of generalized bootstrap",
abstract = "In this paper we have three main results. (a) We show that all jackknife schemes are special cases of generalised bootstrap. (b) We introduce a new generalised bootstrap technique called DBS to estimate the mean-squared error of the least-squares estimate in linear models where the number of parameters tends to infinity with the number of data points, and the error terms are uncorrelated with possibly different variances. Properties of this new resampling scheme are comparable to those of UBS introduced by Chatterjee (1997, Tech. Report No. 2/97, Calcutta). (c) We show that delete-d jackknife schemes belong to DBS or UBS depending on the limit of n⁻¹d. We also study the second-order properties of jackknife variance estimates of the least-squares parameter estimate in regression.",
keywords = "Bootstrap, Jackknife, Least squares, Many parameter regression, Second-order efficiency",
author = "Snigdhansu Chatterjee",
year = "1998",
month = "11",
day = "15",
language = "English (US)",
volume = "40",
pages = "307--319",
journal = "Statistics and Probability Letters",
issn = "0167-7152",
publisher = "Elsevier",
number = "4",

}

TY - JOUR

T1 - Another look at the jackknife

T2 - Further examples of generalized bootstrap

AU - Chatterjee, Snigdhansu

PY - 1998/11/15

Y1 - 1998/11/15

N2 - In this paper we have three main results. (a) We show that all jackknife schemes are special cases of generalised bootstrap. (b) We introduce a new generalised bootstrap technique called DBS to estimate the mean-squared error of the least-squares estimate in linear models where the number of parameters tends to infinity with the number of data points, and the error terms are uncorrelated with possibly different variances. Properties of this new resampling scheme are comparable to those of UBS introduced by Chatterjee (1997, Tech. Report No. 2/97, Calcutta). (c) We show that delete-d jackknife schemes belong to DBS or UBS depending on the limit of n⁻¹d. We also study the second-order properties of jackknife variance estimates of the least-squares parameter estimate in regression.

AB - In this paper we have three main results. (a) We show that all jackknife schemes are special cases of generalised bootstrap. (b) We introduce a new generalised bootstrap technique called DBS to estimate the mean-squared error of the least-squares estimate in linear models where the number of parameters tends to infinity with the number of data points, and the error terms are uncorrelated with possibly different variances. Properties of this new resampling scheme are comparable to those of UBS introduced by Chatterjee (1997, Tech. Report No. 2/97, Calcutta). (c) We show that delete-d jackknife schemes belong to DBS or UBS depending on the limit of n⁻¹d. We also study the second-order properties of jackknife variance estimates of the least-squares parameter estimate in regression.

KW - Bootstrap

KW - Jackknife

KW - Least squares

KW - Many parameter regression

KW - Second-order efficiency

UR - http://www.scopus.com/inward/record.url?scp=0032533058&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0032533058&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0032533058

VL - 40

SP - 307

EP - 319

JO - Statistics and Probability Letters

JF - Statistics and Probability Letters

SN - 0167-7152

IS - 4

ER -