Abstract
We introduce an efficient iterative algorithm, intended for various least squares problems, based on a design of experiments perspective. The algorithm, called orthogonalizing EM (OEM), works for ordinary least squares (OLS) and can be easily extended to penalized least squares. The main idea of the procedure is to orthogonalize a design matrix by adding new rows and then solve the original problem by embedding the augmented design in a missing data framework. We establish several attractive theoretical properties concerning OEM. For OLS with a singular regression matrix, an OEM sequence converges to the least squares estimator based on the Moore-Penrose generalized inverse. For ordinary and penalized least squares with various penalties, it converges to a point having grouping coherence for fully aliased regression matrices. Convergence and the convergence rate of the algorithm are examined. Finally, we demonstrate that OEM is highly efficient for large-scale least squares and penalized least squares problems, and is considerably faster than competing methods when n is much larger than p. Supplementary materials for this article are available online.
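The abstract does not spell out the update equations, so the following is a minimal sketch of the idea as we understand it: augment X with rows Delta so that the stacked design has orthogonal columns (X'X + Delta'Delta = d·I for some d at least as large as the biggest eigenvalue of X'X), treat the responses of the added rows as missing data, and run EM. The augmented rows never need to be formed explicitly, because the E- and M-steps collapse into a single linear (or soft-thresholded) update. The function names `oem_ols` and `oem_lasso`, the defaults, and the choice d = λ_max(X'X) below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def oem_ols(X, y, n_iter=1000, tol=1e-10):
    """Sketch of the OEM iteration for ordinary least squares.

    Started at zero, the iterate stays in the row space of X, so it
    converges to the Moore-Penrose least squares solution even when
    X'X is singular.
    """
    d = np.linalg.eigvalsh(X.T @ X).max()  # any d >= largest eigenvalue of X'X works
    beta = np.zeros(X.shape[1])
    Xty = X.T @ y
    for _ in range(n_iter):
        # E- and M-steps collapse to: beta <- (X'y + (d*I - X'X) beta) / d
        beta_new = beta + (Xty - X.T @ (X @ beta)) / d
        if np.linalg.norm(beta_new - beta) < tol:
            return beta_new
        beta = beta_new
    return beta

def oem_lasso(X, y, lam, n_iter=1000, tol=1e-10):
    """Same iteration with an l1 penalty: the M-step soft-thresholds."""
    d = np.linalg.eigvalsh(X.T @ X).max()
    beta = np.zeros(X.shape[1])
    Xty = X.T @ y
    for _ in range(n_iter):
        u = Xty + d * beta - X.T @ (X @ beta)  # complete-data sufficient statistic
        beta_new = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0) / d
        if np.linalg.norm(beta_new - beta) < tol:
            return beta_new
        beta = beta_new
    return beta

# Quick check against a direct solver (n >> p, the regime where OEM shines):
rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 20))
y = X @ rng.standard_normal(20) + rng.standard_normal(5000)
assert np.allclose(oem_ols(X, y), np.linalg.lstsq(X, y, rcond=None)[0], atol=1e-6)
```

Because the columns of the augmented design are orthogonal, the M-step separates across coordinates, which is what makes the penalized extensions (lasso, SCAD, and others) as cheap per iteration as the OLS version.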
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 285-293 |
| Number of pages | 9 |
| Journal | Technometrics |
| Volume | 58 |
| Issue number | 3 |
| DOIs | |
| State | Published - Jul 2 2016 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2016 American Statistical Association and the American Society for Quality.
Keywords
- Computational statistics
- Design of experiments
- Missing data
- Orthogonal design
- SCAD
- The lasso