Robust Computation of Linear Models by Convex Relaxation

Gilad Lerman, Michael B. McCoy, Joel A. Tropp, Teng Zhang

Research output: Contribution to journal › Article › peer-review

96 Scopus citations


Consider a data set of vector-valued observations that consists of noisy inliers, which are explained well by a low-dimensional subspace, along with some number of outliers. This work describes a convex optimization problem, called reaper, that can reliably fit a low-dimensional model to this type of data. This approach parameterizes linear subspaces using orthogonal projectors and uses a relaxation of the set of orthogonal projectors to reach the convex formulation. The paper provides an efficient algorithm for solving the reaper problem, and it documents numerical experiments that confirm that reaper can dependably find linear structure in synthetic and natural data. In addition, when the inliers lie near a low-dimensional subspace, there is a rigorous theory that describes when reaper can approximate this subspace.
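The abstract, together with the "Iteratively reweighted least squares" keyword below, suggests how such a relaxation can be solved in practice: over the convex hull of rank-d orthogonal projectors, each reweighted subproblem reduces to an eigenvalue computation. The following is a minimal illustrative sketch of an IRLS scheme of this flavor, not the authors' exact algorithm; the function name `reaper_irls` and the parameters `delta` (weight cap) and `n_iter` are assumptions for the example.

```python
import numpy as np

def reaper_irls(X, d, delta=1e-10, n_iter=100):
    """Hedged IRLS sketch for a REAPER-type problem:
        minimize  sum_i ||(I - P) x_i||
        over      0 <= P <= I,  trace(P) = d,
    where P ranges over the convex hull of rank-d orthogonal projectors.
    X: (n, D) array of observations; d: target subspace dimension.
    Returns an approximate solution P (D x D symmetric matrix)."""
    n, D = X.shape
    P = np.eye(D) * (d / D)                    # feasible starting point
    for _ in range(n_iter):
        # Residual norms of each point under the current P.
        res = np.linalg.norm(X - X @ P, axis=1)
        beta = 1.0 / np.maximum(res, delta)    # IRLS weights (capped)
        C = (X * beta[:, None]).T @ X          # weighted second-moment matrix
        # The weighted subproblem minimizes trace((I - P) C); a minimizer
        # over the relaxed set is the projector onto the top-d eigenvectors.
        vals, vecs = np.linalg.eigh(C)         # eigenvalues in ascending order
        U = vecs[:, -d:]                       # top-d eigenvectors
        P = U @ U.T
    return P
```

A quick usage pattern: draw inliers near a low-dimensional subspace, add gross outliers, and compare the recovered P with the projector onto the true subspace. Because the weights `beta` shrink the influence of points with large residuals, the fit is far less sensitive to outliers than a plain least-squares (PCA) fit.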

Original language: English (US)
Pages (from-to): 363-410
Number of pages: 48
Journal: Foundations of Computational Mathematics
Issue number: 2
State: Published - Apr 2015

Bibliographical note

Publisher Copyright:
© 2014, SFoCM.


Keywords

  • Convex relaxation
  • Iteratively reweighted least squares
  • Robust linear models


