NESTT: A nonconvex primal-dual splitting method for distributed and stochastic optimization

Davood Hajinezhad, Mingyi Hong, Tuo Zhao, Zhaoran Wang

Research output: Contribution to journal › Conference article › peer-review

30 Scopus citations


We study a stochastic and distributed algorithm for nonconvex problems whose objective consists of a sum of N nonconvex, (L_i/N)-smooth functions, plus a non-smooth regularizer. The proposed NonconvEx primal-dual SpliTTing (NESTT) algorithm splits the problem into N subproblems and utilizes an augmented-Lagrangian-based primal-dual scheme to solve it in a distributed and stochastic manner. With a special non-uniform sampling, a version of NESTT achieves an ϵ-stationary solution using O((Σ_{i=1}^N √(L_i/N))² / ϵ) gradient evaluations, which can be up to O(N) times better than (proximal) gradient descent methods. It also achieves a Q-linear convergence rate for nonconvex ℓ₁-penalized quadratic problems with polyhedral constraints. Further, we reveal a fundamental connection between primal-dual-based methods and primal-only methods such as IAG/SAG/SAGA.
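The abstract above can be illustrated with a simplified sketch: local copies x_i are tied to a consensus variable z by an augmented Lagrangian, one block is sampled per iteration with probability proportional to √(L_i/N), and z is updated via the proximal operator of the regularizer. This is a hedged toy version (with convex quadratic component functions for an easy sanity check, though NESTT itself targets nonconvex ones), not the paper's exact NESTT-G/NESTT-E update rules; the step-size choice eta = 3 L_i / N is a heuristic assumption.

```python
import numpy as np

# Toy problem:  min_z (1/N) * sum_i g_i(z) + mu * ||z||_1,
# with g_i(z) = 0.5 * ||A_i z - b_i||^2  (convex stand-ins; NESTT allows nonconvex g_i).
rng = np.random.default_rng(0)
N, d, mu = 5, 10, 0.01
A = [rng.standard_normal((8, d)) for _ in range(N)]
b = [rng.standard_normal(8) for _ in range(N)]

def grad_gi(i, v):
    return A[i].T @ (A[i] @ v - b[i])

def soft_threshold(v, t):          # prox of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def objective(v):                  # full objective, for monitoring only
    return np.mean([0.5 * np.linalg.norm(A[i] @ v - b[i]) ** 2
                    for i in range(N)]) + mu * np.linalg.norm(v, 1)

# Smoothness constants L_i and the non-uniform sampling from the abstract:
# p_i proportional to sqrt(L_i / N).
L = np.array([np.linalg.norm(Ai.T @ Ai, 2) for Ai in A])
p = np.sqrt(L / N)
p /= p.sum()
eta = 3.0 * L / N                  # per-block penalty parameters (heuristic choice)

x = [np.zeros(d) for _ in range(N)]    # local primal copies
lam = [np.zeros(d) for _ in range(N)]  # dual variables for x_i = z
z = np.zeros(d)                        # consensus variable

for it in range(2000):
    i = rng.choice(N, p=p)
    # Primal step on the sampled block: minimize the augmented Lagrangian
    # with g_i linearized at z.
    x[i] = z - (grad_gi(i, z) / N + lam[i]) / eta[i]
    # Dual ascent on the consensus constraint x_i = z; note this leaves
    # lam[i] = -grad_gi(z)/N, a SAGA-style stored gradient, which mirrors
    # the primal-dual <-> SAGA connection mentioned in the abstract.
    lam[i] = lam[i] + eta[i] * (x[i] - z)
    # z-update: exact minimization gives a prox of the l1 regularizer
    # applied to the penalty-weighted average of the local copies.
    w = sum(eta[j] * x[j] + lam[j] for j in range(N))
    z = soft_threshold(w / eta.sum(), mu / eta.sum())
```

The dual update collapsing to a table of stale per-block gradients is exactly why the abstract can relate this primal-dual scheme to primal-only incremental methods such as SAG/SAGA.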

Original language: English (US)
Pages (from-to): 3215-3223
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
State: Published - 2016
Externally published: Yes
Event: 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain
Duration: Dec 5 2016 – Dec 10 2016

Bibliographical note

Publisher Copyright:
© 2016 NIPS Foundation - All Rights Reserved.


