Robust stochastic principal component analysis

John Goes, Teng Zhang, Raman Arora, Gilad Lerman

Research output: Contribution to journal › Conference article › peer-review

31 Scopus citations


We consider the problem of finding lower-dimensional subspaces in the presence of outliers and noise in the online setting. In particular, we extend previous batch formulations of robust PCA to the stochastic setting with minimal storage requirements and runtime complexity. We introduce three novel stochastic approximation algorithms for robust PCA that are extensions of standard algorithms for PCA: the stochastic power method, incremental PCA, and online PCA using matrix-exponentiated-gradient (MEG) updates. For robust online PCA we also give a sub-linear convergence guarantee. Our numerical results demonstrate the superiority of the robust online method over the other robust stochastic methods, and the advantage of robust methods over their non-robust counterparts in the presence of outliers, in both artificial and real scenarios.
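To make the idea concrete, here is a minimal illustrative sketch of a robust stochastic power-method update in the spirit the abstract describes: each streamed sample is down-weighted by the norm of its residual from the current subspace estimate, so gross outliers barely move the estimate. The specific weighting, clipping threshold `tau`, step schedule, and all parameter values are our own assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def robust_stochastic_power_method(samples, k, n_passes=5, eta0=0.2, tau=0.1):
    """Sketch of a robust stochastic power-method update (illustrative only).

    Each sample is weighted by 1 / max(||residual||, tau), so points far
    from the current subspace (likely outliers) get small weight. `tau`
    clips the weight to keep step sizes bounded (our choice, not the
    paper's).
    """
    n, d = samples.shape
    rng = np.random.default_rng(0)
    U, _ = np.linalg.qr(rng.standard_normal((d, k)))  # random orthonormal init
    t = 0
    for _ in range(n_passes):
        for x in samples:
            t += 1
            eta = eta0 / np.sqrt(t)                # decaying step size
            r = x - U @ (U.T @ x)                  # residual off the subspace
            w = 1.0 / max(np.linalg.norm(r), tau)  # robust (clipped) weight
            U += eta * w * np.outer(x, U.T @ x)    # weighted Oja/power step
            U, _ = np.linalg.qr(U)                 # re-orthonormalize
    return U

# Toy usage: inliers near a 2-D subspace of R^10, plus gross outliers.
rng = np.random.default_rng(1)
d, k = 10, 2
V, _ = np.linalg.qr(rng.standard_normal((d, k)))   # ground-truth subspace
inliers = rng.standard_normal((400, k)) @ V.T + 0.01 * rng.standard_normal((400, d))
outliers = 3.0 * rng.standard_normal((40, d))
X = rng.permutation(np.vstack([inliers, outliers]))
U = robust_stochastic_power_method(X, k)
# Cosines of principal angles between estimated and true subspace
cos = np.linalg.svd(U.T @ V, compute_uv=False)
```

The per-sample work is O(dk) plus a thin QR, and only the d-by-k matrix `U` is stored, which is the kind of minimal storage/runtime footprint the abstract refers to.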

Original language: English (US)
Pages (from-to): 266-274
Number of pages: 9
Journal: Journal of Machine Learning Research
State: Published - 2014
Event: 17th International Conference on Artificial Intelligence and Statistics, AISTATS 2014, Reykjavik, Iceland
Duration: Apr 22 2014 – Apr 25 2014

