Likelihood-based sufficient dimension reduction

R. Dennis Cook, Liliana Forzani

Research output: Contribution to journal › Article › peer-review

117 Scopus citations


We obtain the maximum likelihood estimator of the central subspace under conditional normality of the predictors given the response. Analytically and in simulations we found that our new estimator can perform much better than sliced inverse regression, sliced average variance estimation, and directional regression, and that it seems quite robust to deviations from normality.
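For context, sliced inverse regression (SIR), the first baseline named in the abstract, is the simplest of the compared estimators: it standardizes the predictors, slices the response, and extracts central-subspace directions from the covariance of the slice means. The sketch below is our own minimal illustration of that baseline (the function name and default slice count are ours), not the likelihood-based estimator the paper develops:

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=2):
    """Sliced inverse regression: estimate central-subspace
    directions from the inverse regression E[X | Y]."""
    n, p = X.shape
    # Standardize the predictors: Z = Sigma^{-1/2} (X - mu).
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Slice the response into roughly equal-count bins.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the slice means of Z.
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Top eigenvectors of M, mapped back to the original X scale.
    w, v = np.linalg.eigh(M)  # eigenvalues in ascending order
    return Sigma_inv_sqrt @ v[:, ::-1][:, :n_dirs]
```

On a single-index model such as y = Xβ + ε with normal predictors, the leading SIR direction should align closely with β; the likelihood-based estimator of the paper targets the same central subspace but via maximum likelihood over a Grassmann manifold.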

Original language: English (US)
Pages (from-to): 197-208
Number of pages: 12
Journal: Journal of the American Statistical Association
Issue number: 485
State: Published - Mar 2009

Bibliographical note

Funding Information:
R. Dennis Cook is Professor, School of Statistics, University of Minnesota, Minneapolis, MN 55455. Liliana Forzani is Professor, Facultad de Ingeniería Química, Universidad Nacional del Litoral and Instituto Matemática Aplicada del Litoral, CONICET, Güemes 3450, (3000) Santa Fe, Argentina. Part of this work was completed while both authors were in residence at the Isaac Newton Institute for Mathematical Sciences, Cambridge, UK. Research for this article was supported in part by grant DMS-0704098 from the U.S. National Science Foundation. The authors are grateful to Bing Li, Penn State University, for providing his directional regression code; to Marcela Morvidone of the Lutheries, Acoustique et Musique team of the Institut Jean Le Rond d'Alembert, Université Pierre et Marie Curie, Paris, for providing the data for the birds–cars–planes illustration; and to the editor for his proactive efforts.


Keywords

  • Central subspace
  • Directional regression
  • Grassmann manifolds
  • Sliced average variance estimation
  • Sliced inverse regression


