Abstract
In this paper, we propose distributed algorithms for sparse principal component analysis (SPCA). The key benefit of the proposed algorithms is their ability to handle distributed data sets. Our algorithms accommodate multiple sparsity-promoting regularizers (i.e., the convex ℓ1 norm and the nonconvex log-sum penalty) as well as different forms of data partitioning (i.e., partitioning across the rows or the columns of the data matrix). Our methods are based on a nonconvex ADMM framework and are shown to converge to stationary solutions of various nonconvex SPCA formulations. Numerical experiments on both real and synthetic data sets, conducted on high-performance computing (HPC) clusters, demonstrate the effectiveness of our approaches.
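To make the row-partitioned setting mentioned in the abstract concrete, the sketch below is a minimal illustration, not the paper's nonconvex ADMM algorithm: it assumes the data matrix is split into row blocks held by different workers, forms the covariance-vector product from locally computed partial sums, and uses an ℓ1 soft-thresholding (proximal) step to promote sparsity in the leading loading vector. The function names, the regularization weight `lam`, and the synthetic data are all illustrative assumptions.

```python
# Minimal sketch of sparse PCA on row-partitioned data (illustrative only; this is
# NOT the nonconvex ADMM algorithm of the paper). Each block A_i is a subset of the
# rows of the data matrix A, so the covariance-vector product
#   (A^T A) x = sum_i A_i^T (A_i x)
# can be formed from per-worker partial sums. Sparsity in the loading vector is
# promoted with an l1 proximal (soft-thresholding) step inside a power-iteration loop.
import numpy as np


def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1; sets small entries exactly to zero."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)


def distributed_sparse_pc(blocks, lam=0.1, iters=200, seed=0):
    """Estimate a sparse leading loading vector from row blocks A_1, ..., A_M."""
    rng = np.random.default_rng(seed)
    p = blocks[0].shape[1]
    x = rng.standard_normal(p)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        # Each "worker" computes A_i^T (A_i x) locally; only length-p vectors
        # need to be aggregated across workers.
        y = sum(A.T @ (A @ x) for A in blocks)
        y /= np.linalg.norm(y)                # ordinary power-iteration scaling
        x_new = soft_threshold(y, lam)        # sparsity-promoting step
        nrm = np.linalg.norm(x_new)
        if nrm == 0.0:                        # lam too large: everything zeroed
            return x_new
        x = x_new / nrm
    return x


if __name__ == "__main__":
    # Synthetic data with a 5-sparse leading direction, split into 4 row blocks.
    rng = np.random.default_rng(1)
    true_dir = np.zeros(50)
    true_dir[:5] = 1.0 / np.sqrt(5)
    A = rng.standard_normal((400, 50)) + 3.0 * np.outer(rng.standard_normal(400), true_dir)
    blocks = np.array_split(A, 4, axis=0)
    x = distributed_sparse_pc(blocks, lam=0.1)
    print("indices of nonzero loadings:", np.flatnonzero(x))
```

In this sketch the only quantity aggregated across workers per iteration is a length-p partial sum, which is what makes the row partition attractive when the number of samples greatly exceeds the number of features.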
Original language | English (US) |
---|---|
Title of host publication | 2015 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2015 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 255-259 |
Number of pages | 5 |
ISBN (Electronic) | 9781479975914 |
DOIs | |
State | Published - Feb 23 2016 |
Event | IEEE Global Conference on Signal and Information Processing, GlobalSIP 2015 - Orlando, United States |
Duration | Dec 13 2015 → Dec 16 2015 |
Publication series
Name | 2015 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2015 |
---|---|
Other
Other | IEEE Global Conference on Signal and Information Processing, GlobalSIP 2015 |
---|---|
Country/Territory | United States |
City | Orlando |
Period | 12/13/15 → 12/16/15 |
Bibliographical note
Publisher Copyright: © 2015 IEEE.
Keywords
- Distributed Optimization
- Non-Convex ADMM
- Sparse PCA