Abstract
Beginning with a discussion of R. A. Fisher's early written remarks that relate to dimension reduction, this article revisits principal components as a reductive method in regression, develops several model-based extensions and ends with descriptions of general approaches to model-based and model-free dimension reduction in regression. It is argued that the role for principal components and related methodology may be broader than previously seen and that the common practice of conditioning on observed values of the predictors may unnecessarily limit the choice of regression methodology.
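The paper itself contains no code, but the reductive idea the abstract revisits can be sketched briefly. The snippet below (not from the article; synthetic data, an arbitrary choice of d = 2 retained components) is a minimal principal components regression: the predictors are first reduced to their leading principal components, and the response is then regressed on those component scores.

```python
import numpy as np

# Illustrative sketch only (not from the paper): principal components
# regression on synthetic data, keeping the d leading components.
rng = np.random.default_rng(0)

n, p, d = 200, 10, 2                      # samples, predictors, retained components
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:2] = [2.0, -1.0]                    # response depends on a low-dimensional reduction of X
y = X @ beta + rng.normal(scale=0.5, size=n)

# Center the predictors and extract principal directions via the SVD.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt[:d]                        # d leading principal directions (d x p)
scores = Xc @ components.T                 # reduced predictors (n x d)

# Ordinary least squares of y on the component scores.
Z = np.column_stack([np.ones(n), scores])
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)

fitted = Z @ coef
print("coefficients on the retained components:", np.round(coef, 3))
print("residual standard deviation:", np.round(np.std(y - fitted), 3))
```

Whether the leading components are the right reduction for a particular regression is exactly the question the article takes up; response-guided alternatives such as sliced inverse regression and principal fitted components (listed under Keywords below) replace this unsupervised reduction step with one that uses information in the response.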
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1-26 |
| Number of pages | 26 |
| Journal | Statistical Science |
| Volume | 22 |
| Issue number | 1 |
| DOIs | |
| State | Published - Feb 2007 |
Keywords
- Central subspace
- Grassmann manifolds
- Inverse regression
- Minimum average variance estimation
- Principal components
- Principal fitted components
- Sliced inverse regression
- Sufficient dimension reduction