First-Order Algorithms for Convex Optimization with Nonseparable Objective and Coupled Constraints

Xiang Gao, Shuzhong Zhang

Research output: Contribution to journal › Article › peer-review

18 Scopus citations

Abstract

In this paper, we consider a block-structured convex optimization model where the block variables are nonseparable in the objective and are further linearly coupled in the constraint. For the 2-block case, we propose a number of first-order algorithms to solve this model. First, the alternating direction method of multipliers (ADMM) is extended, assuming that it is easy to optimize the augmented Lagrangian function with one block of variables at a time while fixing the other block. We prove that an $O(1/t)$ iteration complexity bound holds under suitable conditions, where $t$ is the number of iterations. If the subroutines of the ADMM cannot be implemented, then we propose new alternative algorithms, called the alternating proximal gradient method of multipliers, the alternating gradient projection method of multipliers, and hybrids thereof. Under suitable conditions, the $O(1/t)$ iteration complexity bound is shown to hold for all the newly proposed algorithms. Finally, we extend the analysis of the ADMM to the general multi-block case.
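The 2-block ADMM scheme described above can be illustrated with a minimal sketch on a hypothetical scalar instance (the problem data, step size, and iteration count here are illustrative assumptions, not from the paper): minimize (x1 - 1)^2 + (x2 + 2)^2 subject to the coupling constraint x1 + x2 = 3. For simplicity the toy objective is separable, whereas the paper's setting also covers nonseparable objectives f(x1, x2); each block update below minimizes the augmented Lagrangian in one block while the other is held fixed, followed by a dual update on the multiplier of the coupling constraint.

```python
# Toy 2-block ADMM sketch (hypothetical example, not the paper's code):
#   minimize (x1 - 1)^2 + (x2 + 2)^2   subject to  x1 + x2 = 3.

def admm(rho=1.0, iters=200):
    """Alternating minimization of the augmented Lagrangian
    L(x1, x2, y) = (x1 - 1)**2 + (x2 + 2)**2 + y*(x1 + x2 - b)
                   + (rho/2)*(x1 + x2 - b)**2,   with b = 3."""
    b = 3.0
    x1 = x2 = y = 0.0
    for _ in range(iters):
        # x1-step: solve dL/dx1 = 2*(x1 - 1) + y + rho*(x1 + x2 - b) = 0
        x1 = (2.0 - y - rho * (x2 - b)) / (2.0 + rho)
        # x2-step: solve dL/dx2 = 2*(x2 + 2) + y + rho*(x1 + x2 - b) = 0
        x2 = (-4.0 - y - rho * (x1 - b)) / (2.0 + rho)
        # dual ascent on the multiplier of the coupling constraint
        y += rho * (x1 + x2 - b)
    return x1, x2, y

x1, x2, y = admm()
# Converges to the minimizer (x1, x2) = (3, 0) with multiplier y = -4.
```

Because each subproblem here is an unconstrained strongly convex quadratic, the block updates are available in closed form; when such subroutines are not available, the abstract's proximal-gradient and gradient-projection variants replace the exact block minimization with a single (proximal) gradient step.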

Original language: English (US)
Pages (from-to): 131-159
Number of pages: 29
Journal: Journal of the Operations Research Society of China
Volume: 5
Issue number: 2
DOIs
State: Published - Jun 1 2017

Keywords

  • ADMM
  • Convex optimization
  • First-order algorithms
  • Iteration complexity
  • Proximal gradient method

