Abstract
In this paper we consider nonconvex optimization and learning over a network of distributed nodes. We develop a Proximal Primal-Dual Algorithm (Prox-PDA), which enables the network nodes to distributedly and collectively compute the set of first-order stationary solutions at a global sublinear rate of O(1/r), where r is the iteration counter. To the best of our knowledge, this is the first algorithm that enables distributed nonconvex optimization with global sublinear rate guarantees. Our numerical experiments also demonstrate the effectiveness of the proposed algorithm.
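The abstract does not give the update equations, so the following is only a minimal sketch of a generic primal-dual consensus iteration in the spirit of such methods, not the paper's exact Prox-PDA updates. The ring topology, the quadratic local costs `f_i(x) = 0.5*(x - a_i)**2`, and the step sizes `alpha` and `beta` are all illustrative assumptions; the paper itself targets nonconvex local costs.

```python
# Illustrative sketch (assumptions, not the paper's exact method): each node i
# holds a local copy x[i] of the shared variable and a local cost
# f_i(x) = 0.5 * (x - a[i])**2; edge-based dual variables mu enforce consensus.
def run_primal_dual(a, edges, alpha=0.1, beta=1.0, iters=2000):
    n = len(a)
    x = [0.0] * n            # one local primal copy per node
    mu = [0.0] * len(edges)  # one dual variable per edge, enforcing x[i] == x[j]

    for _ in range(iters):
        # Primal step: gradient of the local cost plus the dual and
        # consensus-penalty terms (A^T mu and beta * A^T A x, with A the
        # edge-node incidence matrix of the graph).
        grad = [x[i] - a[i] for i in range(n)]
        for e, (i, j) in enumerate(edges):
            grad[i] += mu[e] + beta * (x[i] - x[j])
            grad[j] -= mu[e] + beta * (x[i] - x[j])
        x = [x[i] - alpha * grad[i] for i in range(n)]

        # Dual ascent step on the consensus constraint: mu += beta * A x.
        for e, (i, j) in enumerate(edges):
            mu[e] += beta * (x[i] - x[j])
    return x

a = [1.0, 2.0, 3.0, 4.0]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # 4-node ring graph
x = run_primal_dual(a, edges)
```

For this (convex) quadratic example the only stationary consensual point is the average of the `a_i`, so all local copies drift to a common value; the point of the sketch is only the message pattern, where each node exchanges `x` and edge duals with its graph neighbors.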
Original language | English (US) |
---|---|
Title of host publication | 34th International Conference on Machine Learning, ICML 2017 |
Publisher | International Machine Learning Society (IMLS) |
Pages | 2402-2433 |
Number of pages | 32 |
ISBN (Electronic) | 9781510855144 |
State | Published - 2017 |
Event | 34th International Conference on Machine Learning, ICML 2017 - Sydney, Australia |
Duration | Aug 6, 2017 → Aug 11, 2017 |
Publication series
Name | 34th International Conference on Machine Learning, ICML 2017 |
---|---|
Volume | 4 |
Other
Other | 34th International Conference on Machine Learning, ICML 2017 |
---|---|
Country/Territory | Australia |
City | Sydney |
Period | 8/6/17 → 8/11/17 |
Bibliographical note
Publisher Copyright: © Copyright 2017 by the author(s).