A well-tempered landscape for non-convex robust subspace recovery

Tyler Maunu, Teng Zhang, Gilad Lerman

Research output: Contribution to journal › Article › peer-review

29 Scopus citations

Abstract

We present a mathematical analysis of a non-convex energy landscape for robust subspace recovery. We prove that an underlying subspace is the only stationary point and local minimizer in a specified neighborhood under a deterministic condition on a dataset. If the deterministic condition is satisfied, we further show that a geodesic gradient descent method over the Grassmannian manifold can exactly recover the underlying subspace when the method is properly initialized. Proper initialization by principal component analysis is guaranteed with a simple deterministic condition. Under slightly stronger assumptions, the gradient descent method with a piecewise constant step-size scheme achieves linear convergence. The practicality of the deterministic condition is demonstrated on some statistical models of data, and the method achieves almost state-of-the-art recovery guarantees on the Haystack Model for different regimes of sample size and ambient dimension. In particular, when the ambient dimension is fixed and the sample size is large enough, we show that our gradient method can exactly recover the underlying subspace for any fixed fraction of outliers (less than 1).
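The abstract describes an iterative scheme: initialize with PCA, then run geodesic gradient descent over the Grassmannian on the least-absolute-deviations energy \(F(V)=\sum_i \|(I-VV^\top)x_i\|\). The sketch below illustrates the general approach in NumPy; the function names, the fixed-then-shrinking step heuristic, and the synthetic setup are illustrative assumptions, not the paper's exact piecewise constant scheme or guarantees.

```python
import numpy as np

def robust_energy(X, V):
    # F(V) = sum_i ||(I - V V^T) x_i||: sum of distances of the rows of X
    # to the subspace spanned by the orthonormal columns of V.
    R = X - (X @ V) @ V.T              # residual components orthogonal to span(V)
    return np.linalg.norm(R, axis=1).sum()

def geodesic_gradient_step(X, V, step, eps=1e-12):
    # Riemannian gradient of F on the Grassmannian: -sum_i r_i (x_i^T V)/||r_i||,
    # which is automatically horizontal since each r_i is orthogonal to span(V).
    R = X - (X @ V) @ V.T
    w = 1.0 / np.maximum(np.linalg.norm(R, axis=1), eps)
    G = -(R * w[:, None]).T @ (X @ V)  # D x d gradient
    # Geodesic step along -G via thin SVD (Edelman-Arias-Smith formula):
    # V(t) = V W cos(s t) W^T + U sin(s t) W^T for -G = U diag(s) W^T.
    U, s, Wt = np.linalg.svd(-G, full_matrices=False)
    V_new = (V @ Wt.T) @ np.diag(np.cos(s * step)) @ Wt \
            + U @ np.diag(np.sin(s * step)) @ Wt
    Q, _ = np.linalg.qr(V_new)         # re-orthonormalize against rounding drift
    return Q

def recover_subspace(X, d, n_iter=100, step=1e-2, shrink=0.8):
    # PCA initialization: top-d right singular vectors of the data matrix.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    V = Vt[:d].T
    for _ in range(n_iter):
        V_try = geodesic_gradient_step(X, V, step)
        if robust_energy(X, V_try) < robust_energy(X, V):
            V = V_try
        else:
            step *= shrink             # crude stand-in for a step-size schedule
    return V
```

On synthetic data with inliers on a low-dimensional subspace plus outliers, the descent iterations lower the robust energy relative to the PCA initialization, which is the qualitative behavior the paper's analysis makes precise.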

Original language: English (US)
Journal: Journal of Machine Learning Research
Volume: 20
State: Published - Feb 1 2019

Bibliographical note

Publisher Copyright:
© 2019 Tyler Maunu, Teng Zhang and Gilad Lerman.

Keywords

  • Dimension reduction
  • Non-convex optimization
  • Optimization on the Grassmannian
  • Robust subspace recovery

