Prediction weighted maximum frequency selection

Hongmei Liu, J. Sunil Rao

Research output: Contribution to journal › Article › peer-review

Abstract

Shrinkage estimators that can produce sparse solutions have become increasingly important for the analysis of today's complex datasets. Examples include the LASSO, the Elastic-Net and their adaptive counterparts. Estimation of penalty parameters, however, still presents difficulties. While variable-selection-consistent procedures have been developed, their finite sample performance can often be less than satisfactory. We develop a new strategy for variable selection using the adaptive LASSO and adaptive Elastic-Net estimators with p_n diverging. The basic idea is to first use the trace paths of their LARS solutions to bootstrap estimates of maximum frequency (MF) models conditioned on dimension. Conditioning on dimension effectively mitigates overfitting; to deal with underfitting, the MF models are then prediction-weighted. We show that this achieves not only consistent model selection but also attractive convergence rates, leading to excellent finite sample performance. Detailed numerical studies are carried out on both simulated and real datasets. Extensions to the class of generalized linear models are also detailed.

MSC 2010 subject classifications: Primary 62J07.
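
The sketch below is only meant to illustrate the workflow described in the abstract, not the authors' implementation: it bootstraps plain LASSO solution paths (via scikit-learn's lars_path, in place of the paper's adaptive LASSO / adaptive Elastic-Net), records the maximum-frequency support at each model dimension, and then scores those MF models with a simple AIC-style refit criterion standing in for the paper's prediction weighting. All function and variable names are illustrative.

```python
# Hedged sketch of prediction-weighted maximum-frequency (MF) selection.
# Assumptions: plain LASSO paths instead of adaptive LASSO/Elastic-Net,
# and an AIC-style surrogate for the paper's prediction weighting.
import numpy as np
from collections import Counter, defaultdict
from sklearn.linear_model import lars_path, LinearRegression

def pwmf_select(X, y, n_boot=200, seed=None):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    freq = defaultdict(Counter)                      # dimension -> Counter of supports
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)             # bootstrap resample
        _, _, coefs = lars_path(X[idx], y[idx], method="lasso")
        for j in range(coefs.shape[1]):              # walk the solution path
            support = tuple(np.flatnonzero(coefs[:, j]))
            if support:
                freq[len(support)][support] += 1
    # Maximum-frequency model at each dimension
    mf_models = {k: c.most_common(1)[0][0] for k, c in freq.items()}
    # Weight MF models by a crude prediction criterion (AIC-style refit)
    best, best_score = None, np.inf
    for k, support in mf_models.items():
        cols = list(support)
        fit = LinearRegression().fit(X[:, cols], y)
        rss = np.sum((y - fit.predict(X[:, cols])) ** 2)
        score = n * np.log(rss / n) + 2 * k
        if score < best_score:
            best, best_score = support, score
    return best

if __name__ == "__main__":
    # Synthetic example with a 3-variable true model among 20 candidates
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    y = X[:, 0] + 0.8 * X[:, 1] - 0.6 * X[:, 2] + rng.standard_normal(100)
    print(pwmf_select(X, y, n_boot=100, seed=1))
```

Conditioning the frequency counts on dimension (the per-dimension Counter) is what keeps large, overfit supports from swamping the tally; the final scoring step is where the paper's prediction weighting would enter.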

Original language: English (US)
Pages (from-to): 640-681
Number of pages: 42
Journal: Electronic Journal of Statistics
Volume: 11
Issue number: 1
DOIs
State: Published - 2017
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2017, Institute of Mathematical Statistics. All rights reserved.

Keywords

  • Adaptive Elastic-Net
  • Adaptive LASSO
  • Bootstrapping
  • Model selection
