Abstract
We present the first methodology for dimension reduction in regressions with predictors that, given the response, follow one-parameter exponential families. Our approach is based on modeling the conditional distribution of the predictors given the response, which allows us to derive and estimate a sufficient reduction of the predictors. We also propose a method for estimating the forward regression mean function without requiring an explicit forward regression model. Whereas nearly all existing estimators of the central subspace are limited to regressions with continuous predictors, our proposed methodology extends estimation to regressions with all categorical or a mixture of categorical and continuous predictors. Supplementary materials, including the proofs and the computer code, are available from the JCGS website.
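To illustrate the inverse-regression idea described above, the following is a minimal sketch (not the authors' estimator): it assumes independent Bernoulli predictors whose natural parameters are linear in a known low-dimensional function of the response through a p × d matrix Gamma, so that Gamma'X is a sufficient reduction, and it estimates Gamma by plain maximum likelihood with a generic optimizer rather than the Grassmann-manifold optimization used in the paper. All variable names and the simulated data are hypothetical.

```python
# Illustrative sketch only: inverse-regression dimension reduction with
# Bernoulli predictors.  Assumed model: X_j | Y = y independent Bernoulli with
# natural parameter eta_j(y) = alpha_j + (Gamma f(y))_j, Gamma of size p x d.
# Under this model Gamma'X is a sufficient reduction of X.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p, d = 500, 6, 1

# Simulate a binary response and binary predictors whose log-odds shift with Y.
y = rng.integers(0, 2, size=n)
true_gamma = np.array([1.5, -1.5, 1.0, 0.0, 0.0, 0.0])[:, None]   # p x d
f_y = (y - y.mean())[:, None]                                      # n x d, centered f(y)
eta = f_y @ true_gamma.T                                           # n x p natural parameters
X = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))                    # n x p Bernoulli draws

def neg_loglik(theta):
    """Negative Bernoulli inverse-regression log-likelihood in (alpha, Gamma)."""
    alpha = theta[:p]
    G = theta[p:].reshape(p, d)
    eta_hat = alpha + f_y @ G.T
    # log f(x | y) = x * eta - log(1 + exp(eta)), summed over all observations
    return -(np.sum(X * eta_hat) - np.sum(np.logaddexp(0.0, eta_hat)))

fit = minimize(neg_loglik, np.zeros(p + p * d), method="BFGS")
gamma_hat = fit.x[p:].reshape(p, d)

# Only the span of Gamma matters for the reduction Gamma'X, so compare directions.
print(gamma_hat.ravel() / np.linalg.norm(gamma_hat))
print(true_gamma.ravel() / np.linalg.norm(true_gamma))
```

The sketch recovers the direction spanning the reduction; the paper's methodology handles general one-parameter exponential families and estimates the reduction subspace directly, which this toy example does not attempt.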
Original language | English
---|---
Pages (from-to) | 774-791
Number of pages | 18
Journal | Journal of Computational and Graphical Statistics
Volume | 18
Issue number | 3
DOIs |
State | Published - 2009
Bibliographical note
Funding Information: Research for this article was supported in part by National Science Foundation Grant DMS-0405360 awarded to RDC and Grant DMS-0706919 awarded to LL.
Publisher Copyright:
© 2009 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
Keywords
- Central subspace
- Grassmann manifolds
- Inverse regression
- Sufficient dimension reduction