Abstract
Sliced inverse regression is a popular tool for sufficient dimension reduction, which replaces the covariates with a minimal set of their linear combinations without loss of information on the conditional distribution of the response given the covariates. The estimated linear combinations include all of the covariates, making the results difficult to interpret and perhaps unnecessarily variable, particularly when the number of covariates is large. In this paper, we propose a convex formulation for fitting sparse sliced inverse regression in high dimensions. Our proposal estimates the subspace of the linear combinations of the covariates directly and performs variable selection simultaneously. We solve the resulting convex optimization problem via the linearized alternating direction method of multipliers algorithm, and establish an upper bound on the subspace distance between the estimated and the true subspaces. Through numerical studies, we show that our proposal is able to identify the correct covariates in the high-dimensional setting.
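The sketch below is not the authors' implementation; it is a minimal illustration of the kind of estimator the abstract describes: a slice-based estimate of cov(E[X | Y]) combined with an l1 penalty and a Fantope-type trace constraint, solved here by a plain ADMM loop (the paper uses a linearized alternating direction method of multipliers, and its formulation also involves the covariate covariance). All function names, the standardized-covariate assumption, and the tuning values are illustrative choices.

```python
# Minimal sketch of a sparsity-penalised, convex sliced inverse regression estimator.
# Assumes covariates are standardized so that cov(X) is close to the identity.
import numpy as np


def sir_candidate_matrix(X, y, n_slices=10):
    """Slice-based estimate of cov(E[X | Y]) used by sliced inverse regression."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)                      # centre the covariates
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)     # roughly equal-size slices of y
    T = np.zeros((p, p))
    for idx in slices:
        m = Xc[idx].mean(axis=0)                 # within-slice mean of covariates
        T += (len(idx) / n) * np.outer(m, m)     # weighted outer product of slice means
    return T


def fantope_projection(A, d):
    """Project a symmetric matrix onto {Pi : 0 <= Pi <= I, trace(Pi) = d}."""
    evals, evecs = np.linalg.eigh((A + A.T) / 2)

    def clipped_sum(theta):
        return np.clip(evals - theta, 0.0, 1.0).sum()

    lo, hi = evals.min() - 1.0, evals.max()
    for _ in range(100):                         # bisection on the eigenvalue shift
        mid = (lo + hi) / 2
        if clipped_sum(mid) > d:
            lo = mid
        else:
            hi = mid
    gamma = np.clip(evals - (lo + hi) / 2, 0.0, 1.0)
    return (evecs * gamma) @ evecs.T


def sparse_sir(X, y, d=1, lam=0.1, n_slices=10, rho=1.0, n_iter=200):
    """ADMM sketch for max <T, Pi> - lam*||Pi||_1 over the Fantope of dimension d."""
    T = sir_candidate_matrix(X, y, n_slices)
    p = T.shape[0]
    Pi = np.zeros((p, p))
    H = np.zeros((p, p))
    U = np.zeros((p, p))                         # scaled dual variable
    for _ in range(n_iter):
        Pi = fantope_projection(H - U + T / rho, d)
        H = np.sign(Pi + U) * np.maximum(np.abs(Pi + U) - lam / rho, 0.0)  # soft threshold
        U = U + Pi - H
    return Pi                                    # leading eigenvectors span the estimated subspace


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 200, 30
    X = rng.standard_normal((n, p))
    y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n)  # sparse single-index signal
    Pi_hat = sparse_sir(X, y, d=1, lam=0.05)
    print(np.round(np.diag(Pi_hat)[:5], 2))      # large diagonal entries flag active covariates
```

The split into a Fantope projection step and a soft-thresholding step mirrors how such convex relaxations are typically handled by alternating direction methods; the paper's linearized variant and theoretical guarantees differ in their details.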
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 769-782 |
| Number of pages | 14 |
| Journal | Biometrika |
| Volume | 105 |
| Issue number | 4 |
| DOIs | |
| State | Published - Dec 1 2018 |
Bibliographical note
Funding Information: This work was partially supported by the National Science Foundation. We thank the editor, an associate editor, and three reviewers for their comments. We thank Lexin Li and Tao Wang for responding to our inquiries and providing the R code.
Publisher Copyright:
© 2018 Biometrika Trust.
Keywords
- Convex optimization
- Dimension reduction
- Nonparametric regression
- Principal fitted component