Abstract
While there are various model selection methods, an important but unanswered question is how to select one of them for the data at hand. The difficulty is that the targeted behaviors of the model selection procedures depend heavily on assumptions about the data-generating process that are uncheckable or difficult to check. Fortunately, cross-validation (CV) provides a general tool to address this problem. In this work, results are provided on how to apply CV to consistently choose the best method, yielding new insights and guidance for a potentially vast range of applications. In addition, we address several seemingly widespread misconceptions about CV.
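The idea of the abstract can be illustrated with a short sketch: treat each candidate model selection procedure as a black box, repeatedly split the data, fit each procedure on one part, and compare held-out prediction errors. The sketch below is a minimal illustration under assumed choices, not the paper's implementation; it compares Lasso and ridge (standing in for the procedures studied in the paper, such as LASSO, MCP, and SCAD), and the half/half splitting ratio, the simulated sparse design, and the `procedures` dictionary are all assumptions for this example.

```python
import numpy as np
from sklearn.linear_model import LassoCV, RidgeCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assumed simulated data: a sparse linear model where only the
# first 3 of 20 predictors matter.
n, p = 200, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.standard_normal(n)

# Candidate model selection procedures, each a complete fitting pipeline.
# Lasso vs. ridge is a stand-in for the paper's comparisons.
procedures = {
    "lasso": lambda: LassoCV(cv=5),
    "ridge": lambda: RidgeCV(alphas=np.logspace(-3, 3, 13)),
}

# Outer cross-validation over procedures via repeated random splits.
# The data splitting ratio matters for consistency (a theme of the
# paper); a half/half split here is purely illustrative.
n_splits = 50
errors = {name: [] for name in procedures}
for s in range(n_splits):
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.5, random_state=s
    )
    for name, make in procedures.items():
        model = make().fit(X_tr, y_tr)
        errors[name].append(np.mean((y_te - model.predict(X_te)) ** 2))

# Select the procedure with the smaller average held-out error.
for name, errs in errors.items():
    print(f"{name}: mean CV error = {np.mean(errs):.3f}")
best = min(errors, key=lambda k: np.mean(errors[k]))
print("selected procedure:", best)
```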
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 95-112 |
| Number of pages | 18 |
| Journal | Journal of Econometrics |
| Volume | 187 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jul 1 2015 |
Bibliographical note
Funding Information: We thank two anonymous referees, the Associate Editor, and the Editor, Dr. Yacine Aït-Sahalia, for providing us with very insightful comments and valuable suggestions to improve the paper. The research of Yuhong Yang was partially supported by the United States National Science Foundation Grant DMS-1106576.
Publisher Copyright:
© 2015 Elsevier B.V.
Keywords
- Adaptive procedure selection
- Cross-validation
- Cross-validation paradox
- Data splitting ratio
- Information criterion
- LASSO
- MCP
- SCAD