Perturbed proximal primal–dual algorithm for nonconvex nonsmooth optimization

Davood Hajinezhad, Mingyi Hong

Research output: Contribution to journal › Article › peer-review

34 Scopus citations


In this paper, we propose a perturbed proximal primal–dual algorithm (PProx-PDA) for an important class of linearly constrained optimization problems whose objective is the sum of smooth (possibly nonconvex) and convex (possibly nonsmooth) functions. This family of problems can be used to model many statistical and engineering applications, such as high-dimensional subspace estimation and distributed machine learning. The proposed method is of the Uzawa type, in which a primal gradient descent step is performed followed by an (approximate) dual gradient ascent step. One distinctive feature of the proposed algorithm is that the primal and dual steps are both perturbed appropriately using past iterates, so that a number of asymptotic convergence and rate-of-convergence results (to first-order stationary solutions) can be obtained. Finally, we conduct extensive numerical experiments to validate the effectiveness of the proposed algorithm.
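The primal–dual structure described above can be illustrated on a toy problem. The sketch below is an illustration only, not the paper's PProx-PDA: the test function, all parameter values (`rho`, `beta`, `gamma`), and the specific form of the perturbation (damping the dual variable by a factor `1 - gamma`) are assumptions made for this example. It solves min f(x) + g(x) subject to Ax = b, with f smooth and nonconvex, g the nonsmooth convex ℓ1-norm, alternating a proximal gradient (primal) step with a perturbed dual ascent step.

```python
import numpy as np

# Hypothetical toy instance of  min_x f(x) + g(x)  s.t.  A x = b,
# with f smooth but nonconvex and g = ||.||_1 (convex, nonsmooth).
rng = np.random.default_rng(0)
n, m = 20, 5
A = rng.standard_normal((m, n))
b = A @ rng.standard_normal(n)

def grad_f(x):
    # f(x) = sum_i x_i^2 / (1 + x_i^2): smooth, bounded, nonconvex
    return 2.0 * x / (1.0 + x**2) ** 2

def prox_g(v, t):
    # proximal operator of t * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def perturbed_primal_dual(num_iter=3000, rho=1.0, beta=100.0, gamma=1e-3):
    """Illustrative perturbed proximal primal-dual iteration:
    a proximal gradient step on the augmented Lagrangian in x,
    then a dual ascent step damped (perturbed) by a factor (1 - gamma).
    All parameter choices here are ad hoc for this toy problem."""
    x = np.zeros(n)
    lam = np.zeros(m)
    for _ in range(num_iter):
        # primal step: gradient of f + <lam, Ax-b> + (rho/2)||Ax-b||^2,
        # followed by the prox of g (step size 1/beta)
        grad = grad_f(x) + A.T @ lam + rho * A.T @ (A @ x - b)
        x = prox_g(x - grad / beta, 1.0 / beta)
        # dual step: ascent on the constraint residual, with damping
        # of the previous multiplier playing the role of a perturbation
        lam = (1.0 - gamma) * lam + rho * (A @ x - b)
    return x, lam

x, lam = perturbed_primal_dual()
print("constraint residual:", np.linalg.norm(A @ x - b))
```

The damping factor `1 - gamma` keeps the dual sequence bounded, which is one intuition for why perturbing the ascent step can help establish convergence guarantees in the nonconvex setting; the paper's actual perturbation scheme and step-size conditions differ and should be taken from the paper itself.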

Original language: English (US)
Pages (from-to): 207–245
Number of pages: 39
Journal: Mathematical Programming
Issue number: 1-2
State: Published - Jul 1 2019
Externally published: Yes

Bibliographical note

Funding Information:
This work was completed while Davood Hajinezhad was a Ph.D. student at Iowa State University. Mingyi Hong is supported by NSF Grant CMMI-1727757 and AFOSR Grant 15RT0767.

Publisher Copyright:
© 2019, Springer-Verlag GmbH Germany, part of Springer Nature and Mathematical Optimization Society.


Keywords:

  • Nonconvex optimization
  • Nonsmooth optimization
  • Primal–dual algorithms


