Cross-validation for selecting a model selection procedure

Yongli Zhang, Yuhong Yang

Research output: Contribution to journal › Article › peer-review

161 Scopus citations

Abstract

While there are various model selection methods, an unanswered but important question is how to select one of them for the data at hand. The difficulty is that the targeted behaviors of model selection procedures depend heavily on uncheckable or difficult-to-check assumptions about the data generating process. Fortunately, cross-validation (CV) provides a general tool to solve this problem. In this work, results are provided on how to apply CV to consistently choose the best method, yielding new insights and guidance for a potentially vast range of applications. In addition, we address several seemingly widespread misconceptions about CV.
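As a rough illustration of the idea described in the abstract, the sketch below uses repeated half/half data splitting to choose between two candidate model selection procedures by out-of-sample prediction error. The function name cv_choose_procedure, the two procedures compared (CV-tuned Lasso versus ridge regression), and the 50/50 split ratio are illustrative assumptions for this example, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LassoCV, RidgeCV
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import ShuffleSplit


def cv_choose_procedure(X, y, procedures, n_splits=50, train_size=0.5, seed=0):
    """Select among model selection procedures by repeated data splitting:
    fit each procedure on the training part, score its squared prediction
    error on the held-out part, and return the procedure with the smallest
    average error."""
    splitter = ShuffleSplit(n_splits=n_splits, train_size=train_size, random_state=seed)
    errors = {name: [] for name in procedures}
    for train_idx, test_idx in splitter.split(X):
        for name, make_model in procedures.items():
            model = make_model().fit(X[train_idx], y[train_idx])
            pred = model.predict(X[test_idx])
            errors[name].append(mean_squared_error(y[test_idx], pred))
    mean_errors = {name: float(np.mean(errs)) for name, errs in errors.items()}
    return min(mean_errors, key=mean_errors.get), mean_errors


# Illustrative usage on synthetic sparse data (not from the paper).
rng = np.random.default_rng(0)
n, p = 200, 30
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]          # sparse true coefficients
y = X @ beta + rng.standard_normal(n)

procedures = {
    "lasso": lambda: LassoCV(cv=5),  # each procedure does its own internal tuning
    "ridge": lambda: RidgeCV(),
}
best, scores = cv_choose_procedure(X, y, procedures)
print(best, scores)
```

Note that the data splitting ratio is itself a central issue in the paper (see the "Data splitting ratio" and "Cross-validation paradox" keywords below); the half/half split used in this sketch is only one possible choice.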

Original language: English (US)
Pages (from-to): 95-112
Number of pages: 18
Journal: Journal of Econometrics
Volume: 187
Issue number: 1
DOIs
State: Published - Jul 1 2015

Bibliographical note

Funding Information:
We thank two anonymous referees, the Associate Editor and the Editor, Dr. Yacine Ait-Sahalia, for providing us with very insightful comments and valuable suggestions to improve the paper. The research of Yuhong Yang was partially supported by the United States National Science Foundation Grant DMS-1106576.

Publisher Copyright:
© 2015 Elsevier B.V.

Keywords

  • Adaptive procedure selection
  • Cross-validation
  • Cross-validation paradox
  • Data splitting ratio
  • Information criterion
  • LASSO
  • MCP
  • SCAD
