Fisher lecture: Dimension reduction in regression

R. Dennis Cook

Research output: Contribution to journal › Article › peer-review

230 Scopus citations

Abstract

Beginning with a discussion of R. A. Fisher's early written remarks that relate to dimension reduction, this article revisits principal components as a reductive method in regression, develops several model-based extensions and ends with descriptions of general approaches to model-based and model-free dimension reduction in regression. It is argued that the role for principal components and related methodology may be broader than previously seen and that the common practice of conditioning on observed values of the predictors may unnecessarily limit the choice of regression methodology.
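As a purely illustrative companion to the abstract, the following sketch shows principal components used as a reductive step in regression: project the centered predictors onto their leading component directions, then regress the response on the reduced predictors. This is a generic textbook construction on synthetic data, not the specific model-based methodology developed in the article.

```python
import numpy as np

# Illustrative sketch only: principal components as a reductive step in
# regression. All data below are synthetic; nothing here reproduces the
# article's own estimators.
rng = np.random.default_rng(0)
n, p, d = 200, 10, 2          # samples, predictors, reduced dimension

X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:2] = [1.5, -2.0]        # response depends on a low-dimensional projection
y = X @ beta + rng.normal(scale=0.1, size=n)

# Center the predictors and take the top-d principal component directions.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:d].T             # reduced predictors, shape (n, d)

# Ordinary least squares on the reduced predictors (intercept plus d scores).
design = np.column_stack([np.ones(n), Z])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
print(coef.shape)             # one intercept plus d component coefficients
```

The point of the sketch is only the two-stage structure (reduce, then regress); the article's argument concerns when and why such reductions can be justified, and how to move beyond unsupervised principal components.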

Original language: English (US)
Pages (from-to): 1-26
Number of pages: 26
Journal: Statistical Science
Volume: 22
Issue number: 1
DOIs
State: Published - Feb 2007

Keywords

  • Central subspace
  • Grassmann manifolds
  • Inverse regression
  • Minimum average variance estimation
  • Principal components
  • Principal fitted components
  • Sliced inverse regression
  • Sufficient dimension reduction
