Optimal sufficient dimension reduction in regressions with categorical predictors

Xuerong Wen, R. Dennis Cook

Research output: Contribution to journal › Article › peer-review

27 Scopus citations

Abstract

Though partial sliced inverse regression (partial SIR: Chiaromonte et al. [2002. Sufficient dimension reduction in regressions with categorical predictors. Ann. Statist. 30, 475-497]) extended the scope of sufficient dimension reduction to regressions with both continuous and categorical predictors, its requirement of homogeneous predictor covariances across the subpopulations restricts its application in practice. When this condition fails, partial SIR may provide misleading results. In this article, we propose a new estimation method via a minimum discrepancy approach without this restriction. Our method is optimal in terms of asymptotic efficiency and its test statistic for testing the dimension of the partial central subspace always has an asymptotic chi-squared distribution. It also gives us the ability to test predictor effects. An asymptotic chi-squared test of the conditional independence hypothesis that the response is independent of a selected subset of the continuous predictors given the remaining predictors is obtained.
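
To make the minimum discrepancy idea concrete, estimators of this type are usually defined by minimizing a quadratic distance between sample inverse-regression quantities and their values under a d-dimensional model. The display below is a generic sketch of that form, not the paper's exact objective; the notation (ξ̂ for the stacked sample inverse-regression vectors, B and C for the basis and coordinate matrices, V_n for a positive definite weight matrix) is introduced here purely for illustration.

% Generic quadratic discrepancy (illustrative notation only, not the paper's estimator):
% \hat{\xi} stacks sample inverse-regression vectors (e.g., within-subpopulation slice means),
% B spans a candidate d-dimensional subspace, C collects coordinates, V_n is a weight matrix.
\[
F_d(B, C) \;=\; \bigl(\hat{\xi} - \operatorname{vec}(BC)\bigr)^{\top} V_n \,\bigl(\hat{\xi} - \operatorname{vec}(BC)\bigr),
\qquad B \in \mathbb{R}^{p \times d},\; C \in \mathbb{R}^{d \times h}.
\]

In the general theory of such estimators, choosing V_n to converge to the inverse of the asymptotic covariance of ξ̂ gives asymptotic efficiency within this class, and n times the minimized discrepancy has a limiting chi-squared distribution, which is the kind of dimension test the abstract describes.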

Original language: English (US)
Pages (from-to): 1961-1978
Number of pages: 18
Journal: Journal of Statistical Planning and Inference
Volume: 137
Issue number: 6
DOIs
State: Published - Jun 1 2007

Bibliographical note

Copyright 2007 Elsevier B.V. All rights reserved.

Keywords

  • Inverse regression
  • Minimum discrepancy approach
  • Partial SIR
  • Partial central subspace
