Abstract
In this paper, we propose a perturbed proximal primal–dual algorithm (PProx-PDA) for an important class of linearly constrained optimization problems whose objective is the sum of a smooth (possibly nonconvex) function and a convex (possibly nonsmooth) function. This family of problems models many statistical and engineering applications, such as high-dimensional subspace estimation and distributed machine learning. The proposed method is of the Uzawa type, in which a primal gradient descent step is performed followed by an (approximate) dual gradient ascent step. One distinctive feature of the algorithm is that both the primal and dual steps are perturbed appropriately using past iterates, so that a number of asymptotic convergence and rate-of-convergence results (to first-order stationary solutions) can be obtained. Finally, we conduct extensive numerical experiments to validate the effectiveness of the proposed algorithm.
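To make the primal-then-dual structure concrete, the following is a minimal toy sketch of a perturbed proximal primal–dual iteration on a convex quadratic with one linear constraint. It is not the paper's exact PProx-PDA update; the step sizes `alpha`, `beta`, `sigma` and the dual perturbation `eps` are assumptions chosen for illustration only.

```python
import numpy as np

# Toy problem: min_x 0.5*||x||^2  s.t.  x1 + x2 = 1  (solution: x = [0.5, 0.5]).
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x = np.zeros(2)        # primal iterate
lam = np.zeros(1)      # dual iterate (Lagrange multiplier)

alpha, beta = 0.1, 1.0   # primal step size / augmented-penalty weight (assumed)
sigma, eps = 0.5, 0.01   # dual step size / dual perturbation factor (assumed)

for _ in range(2000):
    grad = x  # gradient of the smooth term 0.5*||x||^2
    # Primal gradient descent step on the augmented Lagrangian at (x, lam)
    x = x - alpha * (grad + A.T @ lam + beta * A.T @ (A @ x - b))
    # Perturbed (damped) dual ascent: eps shrinks the multiplier each iteration,
    # trading a small bias in the constraint residual for stability
    lam = (1.0 - eps) * lam + sigma * (A @ x - b)

print(x, float(np.abs(A @ x - b)))
```

With these parameters the iterates settle near the constrained minimizer, with a small constraint-residual bias proportional to the perturbation `eps`.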
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 207-245 |
| Number of pages | 39 |
| Journal | Mathematical Programming |
| Volume | 176 |
| Issue number | 1-2 |
| DOIs | |
| State | Published - Jul 1 2019 |
Bibliographical note
Funding Information: This work was completed while Davood Hajinezhad was a Ph.D. student at Iowa State University. Mingyi Hong is supported by NSF Grant CMMI-1727757 and AFOSR Grant 15RT0767.
Publisher Copyright:
© 2019, Springer-Verlag GmbH Germany, part of Springer Nature and Mathematical Optimization Society.
Keywords
- Nonconvex optimization
- Nonsmooth optimization
- Primal–dual algorithms