Orthogonalizing EM: A Design-Based Least Squares Algorithm

Shifeng Xiong, Bin Dai, Jared Huling, Peter Z.G. Qian

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

We introduce an efficient iterative algorithm, intended for various least squares problems, based on a design of experiments perspective. The algorithm, called orthogonalizing EM (OEM), works for ordinary least squares (OLS) and can be easily extended to penalized least squares. The main idea of the procedure is to orthogonalize a design matrix by adding new rows and then to solve the original problem by embedding the augmented design in a missing data framework. We establish several attractive theoretical properties of OEM. For OLS with a singular regression matrix, an OEM sequence converges to the Moore-Penrose generalized inverse-based least squares estimator. For ordinary and penalized least squares with various penalties, it converges to a point having grouping coherence for fully aliased regression matrices. Convergence and the convergence rate of the algorithm are examined. Finally, we demonstrate that OEM is highly efficient for large-scale least squares and penalized least squares problems, and is considerably faster than competing methods when n is much larger than p. Supplementary materials for this article are available online.
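To make the main idea concrete, the following is a minimal sketch of the OEM iteration for OLS, with a soft-thresholding variant for a lasso-type penalty. It assumes the standard setup described in the abstract: the design matrix X is implicitly augmented with rows Δ chosen so that Δ'Δ = dI − X'X is positive semidefinite (here d is taken as the largest eigenvalue of X'X), and the responses of the added rows are treated as missing data, so each EM step has the closed form below. Function names, the choice of d, and the penalty parameterization are illustrative, not taken from the paper.

```python
import numpy as np

def oem_ls(X, y, lam=0.0, n_iter=500):
    """Sketch of the OEM iteration for (penalized) least squares.

    Implicitly augments X with rows Delta such that
    Delta'Delta = d*I - X'X, where d is the largest eigenvalue
    of X'X, and treats the augmented responses as missing data.
    With lam=0 this is the OLS iteration; lam > 0 applies a
    soft-thresholding step for a lasso-type penalty (the exact
    threshold scaling depends on how the penalty is parameterized).
    """
    XtX = X.T @ X
    Xty = X.T @ y
    d = np.linalg.eigvalsh(XtX)[-1]  # largest eigenvalue of X'X
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        # E-step imputes the missing augmented responses;
        # the M-step then solves an orthogonal least squares
        # problem, which reduces to this closed-form update:
        u = Xty + (d * beta - XtX @ beta)
        beta = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0) / d
    return beta
```

Because the augmented design has orthogonal columns, the M-step decouples across coordinates, which is why a single soft-threshold handles the lasso penalty; the iteration contracts at a rate governed by the eigenvalue ratio of X'X, consistent with the convergence-rate analysis mentioned in the abstract.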

Original language: English (US)
Pages (from-to): 285-293
Number of pages: 9
Journal: Technometrics
Volume: 58
Issue number: 3
DOIs
State: Published - Jul 2 2016
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2016 American Statistical Association and the American Society for Quality.

Keywords

  • Computational statistics
  • Design of experiments
  • Missing data
  • Orthogonal design
  • SCAD
  • The Lasso
