A Sharper Computational Tool for Regression

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Building on previous research by Chi and Chi, this article revisits estimation in robust structured regression under the L2E (integral squared error) criterion. We adopt the majorization-minimization (MM) principle to design a new algorithm for updating the vector of regression coefficients. Our sharp majorization achieves faster convergence than the previous alternating proximal gradient descent algorithm of Chi and Chi. In addition, we reparameterize the model by substituting precision for scale and estimate precision via a modified Newton's method. This simplifies and accelerates overall estimation. We also introduce distance-to-set penalties to enable constrained estimation under nonconvex constraint sets. This tactic also improves performance in coefficient estimation and structure recovery. Finally, we demonstrate the merits of our improved tactics through a rich set of simulation examples and a real data application.
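For context, the integral squared error (L2E) criterion referenced above, written with the precision parameter τ = 1/σ for a Gaussian regression model (the standard Scott-style form; the article's exact constants and notation are not reproduced here), can be expressed as

\[
f(\beta,\tau) \;=\; \frac{\tau}{2\sqrt{\pi}} \;-\; \frac{\sqrt{2}\,\tau}{n\sqrt{\pi}} \sum_{i=1}^{n} \exp\!\Big(-\tfrac{\tau^{2}}{2}\,\big(y_i - x_i^{\top}\beta\big)^{2}\Big),
\]

which alternating schemes minimize by updating β for fixed τ and τ for fixed β. The sketch below illustrates only this alternating structure: it uses a simple tangent-line (convexity-based) majorization for the β-step, which reduces to a weighted least-squares update, and a bounded scalar search over log τ in place of the article's modified Newton step. Names such as l2e_fit are illustrative, and the sharper majorization and distance-to-set penalties of the article are not implemented here.

import numpy as np
from scipy.optimize import minimize_scalar

def l2e_objective(beta, tau, X, y):
    # Gaussian L2E criterion in precision form (an assumed, standard formulation).
    r = y - X @ beta
    n = len(y)
    return (tau / (2 * np.sqrt(np.pi))
            - (np.sqrt(2.0) * tau / (n * np.sqrt(np.pi)))
            * np.exp(-0.5 * (tau * r) ** 2).sum())

def l2e_fit(X, y, n_iter=100, tol=1e-8):
    """Alternate an MM (weighted least-squares) update for beta with a
    1-D solve for the precision tau.  Uses a basic convexity-based
    majorization, not the article's sharper majorization."""
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS warm start
    tau = 1.0 / max(np.std(y - X @ beta), 1e-8)
    obj = l2e_objective(beta, tau, X, y)
    for _ in range(n_iter):
        # MM step for beta: majorize -exp(-u) by its tangent line at the
        # current residuals, giving a weighted least-squares surrogate.
        r = y - X @ beta
        w = np.exp(-0.5 * (tau * r) ** 2)
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
        # Precision step: the article uses a modified Newton's method; a
        # bounded scalar minimization over log(tau) is a simple stand-in.
        res = minimize_scalar(lambda eta: l2e_objective(beta, np.exp(eta), X, y),
                              bounds=(-10.0, 10.0), method="bounded")
        tau = np.exp(res.x)
        new_obj = l2e_objective(beta, tau, X, y)
        if abs(obj - new_obj) < tol * (abs(obj) + 1.0):
            break
        obj = new_obj
    return beta, tau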

Original language: English (US)
Pages (from-to): 117-126
Number of pages: 10
Journal: Technometrics
Volume: 65
Issue number: 1
DOIs
State: Published - 2023
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2022 American Statistical Association and the American Society for Quality.

Keywords

  • Distance penalization
  • Integral squared error criterion
  • MM principle
  • Newton’s method
  • Penalized estimation
