Randomized Primal–Dual Proximal Block Coordinate Updates

Xiang Gao, Yang Yang Xu, Shuzhong Zhang

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

In this paper, we propose a randomized primal–dual proximal block coordinate updating framework for a general multi-block convex optimization model with coupled objective function and linear constraints. Assuming mere convexity, we establish its O(1/t) convergence rate in terms of the objective value and feasibility measure. The framework includes several existing algorithms as special cases such as a primal–dual method for bilinear saddle-point problems (PD-S), the proximal Jacobian alternating direction method of multipliers (Prox-JADMM) and a randomized variant of the ADMM for multi-block convex optimization. Our analysis recovers and/or strengthens the convergence properties of several existing algorithms. For example, for PD-S our result leads to the same order of convergence rate without the previously assumed boundedness condition on the constraint sets, and for Prox-JADMM the new result provides convergence rate in terms of the objective value and the feasibility violation. It is well known that the original ADMM may fail to converge when the number of blocks exceeds two. Our result shows that if an appropriate randomization procedure is invoked to select the updating blocks, then a sublinear rate of convergence in expectation can be guaranteed for multi-block ADMM, without assuming any strong convexity. The new approach is also extended to solve problems where only a stochastic approximation of the subgradient of the objective is available, and we establish an O(1/t) convergence rate of the extended approach for solving stochastic programming.
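To illustrate the kind of scheme the abstract describes, the sketch below applies randomized block selection with gradient-style primal steps and a dual ascent step to a toy separable problem, min Σᵢ ½(xᵢ − cᵢ)² subject to Σᵢ xᵢ = b. This is only a minimal illustration of the general idea (randomly pick one block per iteration, update it against the augmented Lagrangian, then update the multiplier); it is not the authors' exact framework, and all names, step sizes, and the test problem are assumptions made for the example.

```python
import random

def randomized_block_primal_dual(c, b, rho=1.0, eta=0.3, sigma=0.05,
                                 iters=30000, seed=0):
    """Illustrative sketch: minimize sum_i 0.5*(x_i - c_i)^2
    subject to the single linear constraint sum_i x_i = b,
    updating one randomly chosen block per iteration."""
    rng = random.Random(seed)
    n = len(c)
    x = [0.0] * n
    lam = 0.0  # dual multiplier for the linear constraint
    for _ in range(iters):
        i = rng.randrange(n)                       # random block selection
        r = sum(x) - b                             # constraint residual
        # block gradient of the augmented Lagrangian w.r.t. x_i
        grad_i = (x[i] - c[i]) + lam + rho * r
        x[i] -= eta * grad_i                       # primal block update
        lam += sigma * (sum(x) - b)                # dual ascent step
    return x, lam

# At the saddle point, x_i = c_i - lam and lam = (sum(c) - b) / n.
x, lam = randomized_block_primal_dual([1.0, 2.0, 3.0], 3.0)
```

At the fixed point every block gradient and the residual vanish simultaneously, so the randomized iterates settle at the exact constrained minimizer; with more than one constraint the residual `r` and the dual step become vector-valued, but the block structure of the update is the same.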

Original language: English (US)
Pages (from-to): 205-250
Number of pages: 46
Journal: Journal of the Operations Research Society of China
Volume: 7
Issue number: 2
DOI: 10.1007/s40305-018-0232-4
State: Published - Jun 5 2019

Keywords

  • Alternating direction method of multipliers (ADMM)
  • First-order stochastic approximation
  • Iteration complexity
  • Primal–dual method
  • Randomized algorithm

Cite this

Randomized Primal–Dual Proximal Block Coordinate Updates. / Gao, Xiang; Xu, Yang Yang; Zhang, Shuzhong.

In: Journal of the Operations Research Society of China, Vol. 7, No. 2, 05.06.2019, p. 205-250.

@article{2629fe8d900c4060896cc9da9acd126c,
title = "Randomized Primal–Dual Proximal Block Coordinate Updates",
abstract = "In this paper, we propose a randomized primal–dual proximal block coordinate updating framework for a general multi-block convex optimization model with coupled objective function and linear constraints. Assuming mere convexity, we establish its O(1 / t) convergence rate in terms of the objective value and feasibility measure. The framework includes several existing algorithms as special cases such as a primal–dual method for bilinear saddle-point problems (PD-S), the proximal Jacobian alternating direction method of multipliers (Prox-JADMM) and a randomized variant of the ADMM for multi-block convex optimization. Our analysis recovers and/or strengthens the convergence properties of several existing algorithms. For example, for PD-S our result leads to the same order of convergence rate without the previously assumed boundedness condition on the constraint sets, and for Prox-JADMM the new result provides convergence rate in terms of the objective value and the feasibility violation. It is well known that the original ADMM may fail to converge when the number of blocks exceeds two. Our result shows that if an appropriate randomization procedure is invoked to select the updating blocks, then a sublinear rate of convergence in expectation can be guaranteed for multi-block ADMM, without assuming any strong convexity. The new approach is also extended to solve problems where only a stochastic approximation of the subgradient of the objective is available, and we establish an O(1/t) convergence rate of the extended approach for solving stochastic programming.",
keywords = "Alternating direction method of multipliers (ADMM), First-order stochastic approximation, Iteration complexity, Primal–dual method, Randomized algorithm",
author = "Xiang Gao and Xu, {Yang Yang} and Shuzhong Zhang",
year = "2019",
month = "6",
day = "5",
doi = "10.1007/s40305-018-0232-4",
language = "English (US)",
volume = "7",
pages = "205--250",
journal = "Journal of the Operations Research Society of China",
issn = "2194-668X",
publisher = "Springer Science + Business Media",
number = "2",

}

TY - JOUR

T1 - Randomized Primal–Dual Proximal Block Coordinate Updates

AU - Gao, Xiang

AU - Xu, Yang Yang

AU - Zhang, Shuzhong

PY - 2019/6/5

Y1 - 2019/6/5

N2 - In this paper, we propose a randomized primal–dual proximal block coordinate updating framework for a general multi-block convex optimization model with coupled objective function and linear constraints. Assuming mere convexity, we establish its O(1 / t) convergence rate in terms of the objective value and feasibility measure. The framework includes several existing algorithms as special cases such as a primal–dual method for bilinear saddle-point problems (PD-S), the proximal Jacobian alternating direction method of multipliers (Prox-JADMM) and a randomized variant of the ADMM for multi-block convex optimization. Our analysis recovers and/or strengthens the convergence properties of several existing algorithms. For example, for PD-S our result leads to the same order of convergence rate without the previously assumed boundedness condition on the constraint sets, and for Prox-JADMM the new result provides convergence rate in terms of the objective value and the feasibility violation. It is well known that the original ADMM may fail to converge when the number of blocks exceeds two. Our result shows that if an appropriate randomization procedure is invoked to select the updating blocks, then a sublinear rate of convergence in expectation can be guaranteed for multi-block ADMM, without assuming any strong convexity. The new approach is also extended to solve problems where only a stochastic approximation of the subgradient of the objective is available, and we establish an O(1/t) convergence rate of the extended approach for solving stochastic programming.

AB - In this paper, we propose a randomized primal–dual proximal block coordinate updating framework for a general multi-block convex optimization model with coupled objective function and linear constraints. Assuming mere convexity, we establish its O(1 / t) convergence rate in terms of the objective value and feasibility measure. The framework includes several existing algorithms as special cases such as a primal–dual method for bilinear saddle-point problems (PD-S), the proximal Jacobian alternating direction method of multipliers (Prox-JADMM) and a randomized variant of the ADMM for multi-block convex optimization. Our analysis recovers and/or strengthens the convergence properties of several existing algorithms. For example, for PD-S our result leads to the same order of convergence rate without the previously assumed boundedness condition on the constraint sets, and for Prox-JADMM the new result provides convergence rate in terms of the objective value and the feasibility violation. It is well known that the original ADMM may fail to converge when the number of blocks exceeds two. Our result shows that if an appropriate randomization procedure is invoked to select the updating blocks, then a sublinear rate of convergence in expectation can be guaranteed for multi-block ADMM, without assuming any strong convexity. The new approach is also extended to solve problems where only a stochastic approximation of the subgradient of the objective is available, and we establish an O(1/t) convergence rate of the extended approach for solving stochastic programming.

KW - Alternating direction method of multipliers (ADMM)

KW - First-order stochastic approximation

KW - Iteration complexity

KW - Primal–dual method

KW - Randomized algorithm

UR - http://www.scopus.com/inward/record.url?scp=85066468899&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85066468899&partnerID=8YFLogxK

U2 - 10.1007/s40305-018-0232-4

DO - 10.1007/s40305-018-0232-4

M3 - Article

VL - 7

SP - 205

EP - 250

JO - Journal of the Operations Research Society of China

JF - Journal of the Operations Research Society of China

SN - 2194-668X

IS - 2

ER -