TY - GEN
T1 - Sparsity control for robust principal component analysis
AU - Mateos, Gonzalo
AU - Giannakis, Georgios B.
PY - 2010
Y1 - 2010
AB - Principal component analysis (PCA) is widely used for high-dimensional data analysis, with well-documented applications in computer vision, preference measurement, and bioinformatics. In this context, the fresh look advocated here leverages benefits from variable selection and compressive sampling to robustify PCA against outliers. A least-trimmed squares estimator of a low-rank component analysis model is shown to be closely related to the estimator obtained from an ℓ0-(pseudo)norm-regularized criterion that encourages sparsity in a matrix explicitly modeling the outliers. This connection suggests efficient (approximate) solvers based on convex relaxation, which lead naturally to a family of robust estimators subsuming Huber's optimal M-class. Outliers are identified by tuning a regularization parameter, which amounts to controlling the sparsity of the outlier matrix along the whole robustification path of (group) Lasso solutions. Novel algorithms are developed to: i) estimate the low-rank data model both robustly and adaptively; and ii) determine principal components robustly in (possibly) infinite-dimensional feature spaces. Numerical tests corroborate the effectiveness of the proposed robust PCA scheme for a video surveillance task.
UR - http://www.scopus.com/inward/record.url?scp=79957992016&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=79957992016&partnerID=8YFLogxK
U2 - 10.1109/ACSSC.2010.5757875
DO - 10.1109/ACSSC.2010.5757875
M3 - Conference contribution
AN - SCOPUS:79957992016
SN - 9781424497218
T3 - Conference Record - Asilomar Conference on Signals, Systems and Computers
SP - 1925
EP - 1929
BT - Conference Record of the 44th Asilomar Conference on Signals, Systems and Computers, Asilomar 2010
T2 - 44th Asilomar Conference on Signals, Systems and Computers, Asilomar 2010
Y2 - 7 November 2010 through 10 November 2010
ER -