Sparse Minimum Discrepancy Approach to Sufficient Dimension Reduction with Simultaneous Variable Selection in Ultrahigh Dimension

Wei Qian, Shanshan Ding, R. Dennis Cook

Research output: Contribution to journal › Article › peer-review


Abstract

Sufficient dimension reduction (SDR) is a powerful tool for data reduction and data visualization in regression and classification problems. In this work, we study ultrahigh-dimensional SDR problems and propose solutions under a unified minimum discrepancy approach with regularization. When the predictor dimension p grows exponentially with the sample size n, consistency results for both central subspace estimation and variable selection are established simultaneously for important SDR methods, including sliced inverse regression (SIR), principal fitted components (PFC), and sliced average variance estimation (SAVE). Special sparse structures of the large predictor or error covariance matrices are also considered for potentially better performance. In addition, the proposed approach is equipped with a new algorithm that efficiently solves the regularized objective functions and a new data-driven procedure that determines the structural dimension and tuning parameters, without the need to invert a large covariance matrix. Simulations and a real data analysis demonstrate the promise of our proposal in ultrahigh-dimensional settings. Supplementary materials for this article are available online.
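For readers unfamiliar with the inverse regression methods named above, the following is a minimal sketch of classical (unregularized) sliced inverse regression in Python with NumPy. It is not the paper's sparse minimum discrepancy estimator: it inverts the sample covariance, which is precisely the step the proposed approach avoids in ultrahigh dimension, and it assumes n > p. The function name sir_directions, the slicing scheme, and all parameter defaults are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=2):
    """Classical SIR sketch: estimate a basis of the central subspace
    by eigen-decomposing the between-slice covariance of standardized
    predictors. Assumes n > p so the sample covariance is invertible.
    """
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Partition observations into slices of roughly equal size by sorted y
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Between-slice covariance: M = sum_h p_h * m_h m_h^T
    M = np.zeros((p, p))
    for idx in slices:
        m_h = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m_h, m_h)
    # Top-d eigenvectors of M, mapped back to the original predictor scale
    w, v = np.linalg.eigh(M)
    beta = inv_sqrt @ v[:, np.argsort(w)[::-1][:d]]
    return beta
```

The covariance inversion above is exactly what becomes infeasible when p grows exponentially with n, motivating the regularized, inversion-free formulation studied in the article.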

Original language: English (US)
Pages (from-to): 1277-1290
Number of pages: 14
Journal: Journal of the American Statistical Association
Volume: 114
Issue number: 527
DOIs
State: Published - Jul 3, 2019

Keywords

  • Central subspace
  • Inverse regression
  • Principal fitted component
  • Sliced average variance estimation
  • Sliced inverse regression
  • Sparsity

