Abstract
With the advancement of modern technology, array-valued data are often encountered in applications. Such data can exhibit both high dimensionality and complex structure. Traditional methods for sufficient dimension reduction (SDR) are generally inefficient for array-valued data because they cannot adequately capture the underlying structure. In this article, we discuss recently developed higher-order approaches to SDR for regressions with matrix- or array-valued predictors, with a special focus on sliced inverse regression. These methods can reduce all dimensions of an array-valued predictor simultaneously with little or no loss of the information relevant for prediction and classification. We briefly discuss the implementation procedure for each method.
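To make the idea concrete, below is a minimal sketch of classic (vector-valued) sliced inverse regression, the building block that the folded and tensor SIR methods discussed in the article apply mode-wise to each dimension of an array-valued predictor. The function name `sir`, the number of slices, and the synthetic data are illustrative choices, not part of the article itself.

```python
import numpy as np

def sir(X, y, n_slices=10, n_directions=1):
    """Classic sliced inverse regression for a vector predictor.

    Approximates the inverse regression curve E[X | y] by slice means
    of the standardized predictor, then extracts the leading
    eigenvectors of their weighted covariance.
    """
    n, p = X.shape
    # Standardize the predictor: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt
    # Slice the response by its order statistics; average Z per slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    _, vecs = np.linalg.eigh(M)
    directions = Sigma_inv_sqrt @ vecs[:, -n_directions:]
    return directions / np.linalg.norm(directions, axis=0)
```

For a matrix-valued predictor, the folded approaches surveyed in the article estimate a separate projection for each mode (rows and columns) rather than vectorizing the matrix and applying the routine above directly, which would destroy the array structure and inflate the number of parameters.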
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 249-257 |
| Number of pages | 9 |
| Journal | Wiley Interdisciplinary Reviews: Computational Statistics |
| Volume | 7 |
| Issue number | 4 |
| DOIs | |
| State | Published - Jul 2015 |
Keywords
- Dimension folding
- Sliced inverse regression
- Sufficient dimension reduction
- Tensor data