Bootstrapping likelihood for model selection with small samples

Research output: Contribution to journal › Article › peer-review

22 Scopus citations

Abstract

Akaike’s information criterion (AIC), derived from asymptotics of the maximum likelihood estimator, is widely used in model selection. However, it has a finite-sample bias that produces overfitting in linear regression. To deal with this problem, Ishiguro, Sakamoto, and Kitagawa proposed a bootstrap-based extension to AIC which they called EIC. This article compares model-selection performance of AIC, EIC, a bootstrap-smoothed likelihood cross-validation (BCV) and its modification (632CV) in small-sample linear regression, logistic regression, and Cox regression. Simulation results show that EIC largely overcomes AIC’s overfitting problem and that BCV may be better than EIC. Hence, the three methods based on bootstrapping the likelihood establish themselves as important alternatives to AIC in model selection with small samples.
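The bias correction behind EIC can be illustrated with a short sketch. The following is a minimal, illustrative implementation for Gaussian linear regression only, not the authors' code: it estimates the optimism of the maximized log-likelihood by refitting on bootstrap resamples (the function names `eic`, `fit_mle`, and `gauss_loglik` are hypothetical helpers introduced here, and `B` is the bootstrap replication count).

```python
import numpy as np

def gauss_loglik(y, X, beta, sigma2):
    # Gaussian log-likelihood of y given design X, coefficients beta, variance sigma2
    resid = y - X @ beta
    n = len(y)
    return -0.5 * n * np.log(2 * np.pi * sigma2) - 0.5 * resid @ resid / sigma2

def fit_mle(y, X):
    # Maximum-likelihood fit of a linear model: beta via least squares, sigma2 = RSS/n
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ beta) ** 2)
    return beta, sigma2

def eic(y, X, B=200, rng=None):
    # Bootstrap-extended criterion in the spirit of Ishiguro et al.'s EIC:
    # -2 * loglik(data; MLE) plus twice a bootstrap estimate of the optimism bias.
    rng = np.random.default_rng(rng)
    n = len(y)
    beta, sigma2 = fit_mle(y, X)
    ll = gauss_loglik(y, X, beta, sigma2)
    bias = 0.0
    for _ in range(B):
        idx = rng.integers(0, n, n)          # resample cases with replacement
        yb, Xb = y[idx], X[idx]
        bb, s2b = fit_mle(yb, Xb)
        # Optimism: log-likelihood of the bootstrap fit on its own sample
        # minus its log-likelihood on the original data.
        bias += gauss_loglik(yb, Xb, bb, s2b) - gauss_loglik(y, X, bb, s2b)
    return -2 * ll + 2 * bias / B
```

As with AIC, candidate models would be compared by computing `eic` for each and preferring the smallest value; the bootstrap bias term replaces AIC's fixed penalty of twice the parameter count.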

Original language: English (US)
Pages (from-to): 687-698
Number of pages: 12
Journal: Journal of Computational and Graphical Statistics
Volume: 8
Issue number: 4
DOIs
State: Published - Dec 1999

Keywords

  • AIC
  • Cox regression
  • Cross-validation
  • EIC
  • Linear regression
  • Logistic regression
  • Maximum likelihood

