Sparse sliced inverse regression for high dimensional data analysis

Haileab Hilafu, Sandra E. Safo

Research output: Contribution to journal › Article › peer-review


Abstract

Background: Dimension reduction and variable selection play a critical role in the analysis of contemporary high-dimensional data. The semi-parametric multi-index model often serves as a reasonable model for analysis of such high-dimensional data. The sliced inverse regression (SIR) method, which can be formulated as a generalized eigenvalue decomposition problem, offers a model-free estimation approach for the indices in the semi-parametric multi-index model. Obtaining sparse estimates of the eigenvectors that constitute the basis matrix used to construct the indices is desirable to facilitate variable selection, which in turn facilitates interpretability and model parsimony.

Results: To this end, we propose a group-Dantzig selector type formulation that induces row-sparsity in the sliced inverse regression dimension reduction vectors. Extensive simulation studies are carried out to assess the performance of the proposed method and compare it with other state-of-the-art methods in the literature.

Conclusion: The proposed method is shown to yield competitive estimation, prediction, and variable selection performance. Three real data applications, including a metabolomics depression study, are presented to demonstrate the method's effectiveness in practice.
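As background for the abstract above, classical (non-sparse) SIR estimates the index directions by slicing the response, averaging the standardized predictors within each slice, and solving an eigenvalue problem on the resulting between-slice covariance. The sketch below shows this baseline procedure only; the paper's sparse group-Dantzig formulation is not implemented here, and the function name `sir_directions` is illustrative.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=2):
    """Classical sliced inverse regression (Li, 1991): a minimal dense
    sketch of the eigenvalue formulation referenced in the abstract.
    Returns n_dirs unit-norm direction estimates (columns)."""
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(Sigma)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice y by rank and form the weighted between-slice covariance
    # M = sum_h p_h * m_h m_h^T of the within-slice means of Z
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    B = inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)
```

Note that the recovered directions are identified only up to sign and span, so accuracy is usually judged by the angle between the estimated and true subspaces rather than by the coefficients themselves.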

Original language: English (US)
Article number: 168
Journal: BMC Bioinformatics
Volume: 23
Issue number: 1
DOIs
State: Published - Dec 2022

Bibliographical note

Publisher Copyright:
© 2022, The Author(s).

Keywords

  • Generalized eigenvalue decomposition
  • High-dimensional data
  • Linear discriminant analysis
  • Semiparametric model
  • Sliced inverse regression

PubMed: MeSH publication types

  • Journal Article

