### Abstract

Consider a data set of vector-valued observations that consists of noisy inliers, which are explained well by a low-dimensional subspace, along with some number of outliers. This work describes a convex optimization problem, called reaper, that can reliably fit a low-dimensional model to this type of data. This approach parameterizes linear subspaces using orthogonal projectors and uses a relaxation of the set of orthogonal projectors to reach the convex formulation. The paper provides an efficient algorithm for solving the reaper problem, and it documents numerical experiments that confirm that reaper can dependably find linear structure in synthetic and natural data. In addition, when the inliers lie near a low-dimensional subspace, there is a rigorous theory that describes when reaper can approximate this subspace.
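The approach described in the abstract can be illustrated with a small numerical sketch. The snippet below is not the paper's reaper program itself: it is a hypothetical iteratively-reweighted-least-squares variant in the same spirit, where each step simply fits the top-d eigenspace of a weighted covariance instead of solving the constrained convex relaxation exactly. All function and variable names here are illustrative assumptions, not from the paper.

```python
import numpy as np

def reweighted_subspace_fit(X, d, n_iter=50, delta=1e-8):
    """IRLS-style robust subspace fit in the spirit of reaper.

    X : (n, D) array whose rows are observations.
    d : target subspace dimension.

    Simplification (assumption): each step fits the top-d eigenspace of
    a weighted covariance rather than solving reaper's constrained
    convex program over relaxed orthogonal projectors exactly.
    """
    n, D = X.shape
    # Initialize with ordinary PCA (top-d right singular vectors).
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    V = Vt[:d].T                        # (D, d) orthonormal basis
    for _ in range(n_iter):
        # Residual of each point from the current subspace.
        R = X - (X @ V) @ V.T
        r = np.linalg.norm(R, axis=1)
        # Downweight points far from the subspace (robust to outliers);
        # delta caps the weight of near-zero residuals.
        w = 1.0 / np.maximum(r, delta)
        C = (X * w[:, None]).T @ X      # weighted covariance, (D, D)
        vals, vecs = np.linalg.eigh(C)  # eigenvalues in ascending order
        V = vecs[:, -d:]                # top-d eigenspace
    return V
```

On synthetic data of the kind the abstract describes (noisy inliers near a plane plus isotropic outliers), the reweighting step typically tightens a plain PCA fit, since outliers with large residuals receive small weights.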

Original language | English (US)
---|---
Pages (from-to) | 363-410
Number of pages | 48
Journal | Foundations of Computational Mathematics
Volume | 15
Issue number | 2
DOIs | 10.1007/s10208-014-9221-0
State | Published - 2015

### Keywords

- Convex relaxation
- Iteratively reweighted least squares
- Robust linear models

## Fingerprint

Research topics of 'Robust Computation of Linear Models by Convex Relaxation'.

## Cite this

Robust Computation of Linear Models by Convex Relaxation. *Foundations of Computational Mathematics*, *15*(2), 363-410. https://doi.org/10.1007/s10208-014-9221-0