Prox-PDA: The proximal primal-dual algorithm for fast distributed nonconvex optimization and learning over networks

Mingyi Hong, Davood Hajinezhad, Ming Min Zhao

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

40 Scopus citations

Abstract

In this paper we consider nonconvex optimization and learning over a network of distributed nodes. We develop a Proximal Primal-Dual Algorithm (Prox-PDA), which enables the network nodes to distributedly and collectively compute the set of first-order stationary solutions in a global sublinear manner [with a rate of O(1/r), where r is the iteration counter]. To the best of our knowledge, this is the first algorithm that enables distributed nonconvex optimization with global sublinear rate guarantees. Our numerical experiments also demonstrate the effectiveness of the proposed algorithm.
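The primal-dual scheme summarized in the abstract can be illustrated with a minimal sketch: each node holds a local copy of the variable, consensus is encoded as a linear constraint A x = 0 via the graph's signed incidence matrix, and the iteration alternates a proximal primal step with dual ascent. The 3-node graph, penalty parameter, and simple quadratic local costs below are illustrative assumptions for a checkable toy example, not the paper's exact setup or matrices.

```python
import numpy as np

# Sketch of a proximal primal-dual (Prox-PDA-style) iteration for
# consensus optimization min_x sum_i f_i(x_i) subject to A x = 0,
# where A is the signed edge-node incidence matrix of the graph.
# Graph, beta, and local costs are assumptions for illustration.

# 3-node path graph with edges (0,1) and (1,2)
A = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])   # signed incidence matrix
B = np.abs(A)                      # signless incidence, used in the prox term

# simple smooth local costs f_i(x) = (x - a_i)^2 (convex here so the
# answer is checkable; the algorithm itself targets nonconvex f_i)
a = np.array([1.0, 2.0, 3.0])

def grad_f(x):
    return 2.0 * (x - a)

beta = 5.0                  # penalty/step parameter (assumed large enough)
x = np.zeros(3)             # one scalar primal variable per node
mu = np.zeros(A.shape[0])   # one dual variable per edge

for r in range(500):
    # primal step: minimize the linearized local cost plus the augmented
    # Lagrangian term (beta/2)||A x||^2 and prox term (beta/2)||B(x - x_r)||^2;
    # everything is quadratic in x, so the minimizer is a linear solve
    M = beta * (A.T @ A + B.T @ B)
    rhs = beta * (B.T @ B) @ x - grad_f(x) - A.T @ mu
    x = np.linalg.solve(M, rhs)
    # dual ascent on the consensus constraint A x = 0
    mu = mu + beta * (A @ x)

print(x)  # entries agree (consensus) at the minimizer mean(a) = 2
```

In this convex toy instance the iterates converge to consensus at the global minimizer; the paper's contribution is the O(1/r) stationarity guarantee when the f_i are nonconvex.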

Original language: English (US)
Title of host publication: 34th International Conference on Machine Learning, ICML 2017
Publisher: International Machine Learning Society (IMLS)
Pages: 2402-2433
Number of pages: 32
ISBN (Electronic): 9781510855144
State: Published - 2017
Event: 34th International Conference on Machine Learning, ICML 2017 - Sydney, Australia
Duration: Aug 6 2017 - Aug 11 2017

Publication series

Name: 34th International Conference on Machine Learning, ICML 2017
Volume: 4

Other

Other: 34th International Conference on Machine Learning, ICML 2017
Country/Territory: Australia
City: Sydney
Period: 8/6/17 - 8/11/17

Bibliographical note

Publisher Copyright:
© Copyright 2017 by the author(s).

