Abstract
Sliced inverse regression (SIR) is an innovative and effective method for dimension reduction and data visualization of high-dimensional problems. It replaces the original variables with low-dimensional linear combinations of predictors without any loss of regression information and without the need to prespecify a model or an error distribution. However, it suffers from the fact that each SIR component is a linear combination of all the original predictors; thus, it is often difficult to interpret the extracted components. By representing SIR as a regression-type optimization problem, we propose in this article a new method, called sparse SIR, that combines the shrinkage idea of the lasso with SIR to produce both accurate and sparse solutions. The efficacy of the proposed method is verified by simulation, and a real data example is given.
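To make the two ingredients concrete, below is a minimal sketch of the general idea, not the authors' exact algorithm: first estimate an ordinary SIR direction by slicing the response and eigendecomposing the between-slice covariance of the standardized predictors, then re-express that direction as a regression fit and shrink it with the lasso so that most loadings become exactly zero. The simulated data, slice count `H`, and lasso penalty `alpha` are illustrative assumptions, and the two-step approximation here stands in for the paper's regression-type formulation of SIR.

```python
# Hedged sketch of the sparse-SIR idea: SIR direction first, lasso shrinkage second.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 400, 10
X = rng.normal(size=(n, p))
# True regression depends on only two predictors, so a sparse direction exists.
y = (X[:, 0] + X[:, 1]) ** 3 + rng.normal(size=n)

# --- Step 1: ordinary SIR -------------------------------------------------
H = 10                                    # number of slices of the response
Z = (X - X.mean(axis=0)) / X.std(axis=0)  # standardized predictors
slices = np.array_split(np.argsort(y), H)
# Weighted between-slice covariance of the slice means of Z.
M = np.zeros((p, p))
for idx in slices:
    m = Z[idx].mean(axis=0)
    M += (len(idx) / n) * np.outer(m, m)
eigvals, eigvecs = np.linalg.eigh(M)
beta_sir = eigvecs[:, -1]                 # leading SIR direction (dense in all p predictors)

# --- Step 2: sparsify with the lasso --------------------------------------
# Treat the SIR index Z @ beta_sir as a working response and refit it with an
# L1 penalty; the nonzero coefficients give an interpretable sparse direction.
lasso = Lasso(alpha=0.05, fit_intercept=False).fit(Z, Z @ beta_sir)
beta_sparse = lasso.coef_

print("dense SIR loadings :", np.round(beta_sir, 2))
print("sparse SIR loadings:", np.round(beta_sparse, 2))
```

With this setup, the dense SIR direction loads (weakly) on all ten predictors, while the lasso step typically zeroes out the eight irrelevant ones; the penalty would in practice be chosen by a criterion such as cross-validation rather than fixed in advance.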
| Original language | English (US) |
| --- | --- |
| Pages | 503-510 |
| Number of pages | 8 |
| Volume | 48 |
| Issue | 4 |
| Specialist publication | Technometrics |
| State | Published - Nov 2006 |
Bibliographical note
Funding Information: Li's research was supported by NIH grant ES11269 while he was a postdoctoral researcher at the University of California, Davis, in 2004. The authors thank the editor, the associate editor, and two anonymous referees for comments and suggestions that led to significant improvements in this article.
Keywords
- Lasso
- Regression shrinkage
- Sliced inverse regression
- Sufficient dimension reduction