Accelerated primal–dual proximal block coordinate updating methods for constrained convex optimization

Yangyang Xu, Shuzhong Zhang

Research output: Contribution to journal › Article

5 Citations (Scopus)

Abstract

Block coordinate update (BCU) methods enjoy low per-update computational complexity because, at each iteration, only one or a few of a possibly large number of block variables need to be updated. They are also easily parallelized and have therefore become particularly popular for problems involving large-scale datasets and/or many variables. In this paper, we propose a primal–dual BCU method for solving linearly constrained convex programs with multi-block variables. The method is an accelerated version of a primal–dual algorithm proposed by the authors, which selects the block variables to update at random and establishes an O(1/t) convergence rate under a convexity assumption. We show that the rate can be accelerated to O(1/t^2) if the objective is strongly convex. In addition, if one block variable is independent of the others in the objective, we show that the algorithm can be modified to achieve a linear rate of convergence. Numerical experiments show that the accelerated method performs stably with a single set of parameters, whereas the original method must have its parameters tuned for each dataset to achieve a comparable level of performance.
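The randomized primal–dual BCU idea described in the abstract can be illustrated with a small sketch: at each iteration one block is chosen at random and updated with a linearized augmented-Lagrangian step, followed by a dual ascent step on the multiplier of the linear constraint. This is only a schematic toy, not the authors' exact algorithm; the problem instance (minimize (1/2)||x||^2 subject to 1ᵀx = 1), the coordinate-sized blocks, and the step sizes `eta` and `rho` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly constrained, strongly convex problem:
#   minimize (1/2)||x||^2  subject to  A x = b,
# with A = [1 1 1 1], b = 1; the unique solution is x_i = 1/4.
n = 4
A = np.ones((1, n))
b = np.array([1.0])

x = np.zeros(n)       # primal variables (one block per coordinate here)
lam = np.zeros(1)     # dual variable for the constraint A x = b
eta, rho = 0.1, 0.2   # primal step size and augmented-Lagrangian penalty

for _ in range(20000):
    i = rng.integers(n)            # randomly select one block to update
    r = A @ x - b                  # constraint residual
    # Gradient of the augmented Lagrangian w.r.t. block i:
    g_i = x[i] + A[0, i] * (lam[0] + rho * r[0])
    x[i] -= eta * g_i              # proximal-gradient-style block update
    lam += rho * (A @ x - b)       # dual ascent step

# x approaches (0.25, 0.25, 0.25, 0.25) and lam approaches -0.25,
# the KKT point of the toy problem.
```

Because the KKT point is a fixed point of the update for every choice of block, the random scheme converges without residual noise on this problem; the paper's contribution is the accelerated variant with provable O(1/t^2) and linear rates under stronger assumptions.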

Original language: English (US)
Pages (from-to): 91-128
Number of pages: 38
Journal: Computational Optimization and Applications
Volume: 70
Issue number: 1
DOIs: 10.1007/s10589-017-9972-z
State: Published - May 1, 2018


Keywords

  • Accelerated first-order method
  • Alternating direction method of multipliers (ADMM)
  • Block coordinate update
  • Primal–dual method

Cite this

Accelerated primal–dual proximal block coordinate updating methods for constrained convex optimization. / Xu, Yangyang; Zhang, Shuzhong.

In: Computational Optimization and Applications, Vol. 70, No. 1, 01.05.2018, p. 91-128.

Research output: Contribution to journal › Article

@article{3715d05808354a25af7a9ee691d7b333,
title = "Accelerated primal–dual proximal block coordinate updating methods for constrained convex optimization",
abstract = "Block coordinate update (BCU) methods enjoy low per-update computational complexity because, at each iteration, only one or a few of a possibly large number of block variables need to be updated. They are also easily parallelized and have therefore become particularly popular for problems involving large-scale datasets and/or many variables. In this paper, we propose a primal–dual BCU method for solving linearly constrained convex programs with multi-block variables. The method is an accelerated version of a primal–dual algorithm proposed by the authors, which selects the block variables to update at random and establishes an O(1/t) convergence rate under a convexity assumption. We show that the rate can be accelerated to O(1/t^2) if the objective is strongly convex. In addition, if one block variable is independent of the others in the objective, we show that the algorithm can be modified to achieve a linear rate of convergence. Numerical experiments show that the accelerated method performs stably with a single set of parameters, whereas the original method must have its parameters tuned for each dataset to achieve a comparable level of performance.",
keywords = "Accelerated first-order method, Alternating direction method of multipliers (ADMM), Block coordinate update, Primal–dual method",
author = "Yangyang Xu and Shuzhong Zhang",
year = "2018",
month = may,
day = "1",
doi = "10.1007/s10589-017-9972-z",
language = "English (US)",
volume = "70",
pages = "91--128",
journal = "Computational Optimization and Applications",
issn = "0926-6003",
publisher = "Springer Netherlands",
number = "1",
}

TY - JOUR

T1 - Accelerated primal–dual proximal block coordinate updating methods for constrained convex optimization

AU - Xu, Yangyang

AU - Zhang, Shuzhong

PY - 2018/5/1

Y1 - 2018/5/1

N2 - Block coordinate update (BCU) methods enjoy low per-update computational complexity because, at each iteration, only one or a few of a possibly large number of block variables need to be updated. They are also easily parallelized and have therefore become particularly popular for problems involving large-scale datasets and/or many variables. In this paper, we propose a primal–dual BCU method for solving linearly constrained convex programs with multi-block variables. The method is an accelerated version of a primal–dual algorithm proposed by the authors, which selects the block variables to update at random and establishes an O(1/t) convergence rate under a convexity assumption. We show that the rate can be accelerated to O(1/t^2) if the objective is strongly convex. In addition, if one block variable is independent of the others in the objective, we show that the algorithm can be modified to achieve a linear rate of convergence. Numerical experiments show that the accelerated method performs stably with a single set of parameters, whereas the original method must have its parameters tuned for each dataset to achieve a comparable level of performance.

AB - Block coordinate update (BCU) methods enjoy low per-update computational complexity because, at each iteration, only one or a few of a possibly large number of block variables need to be updated. They are also easily parallelized and have therefore become particularly popular for problems involving large-scale datasets and/or many variables. In this paper, we propose a primal–dual BCU method for solving linearly constrained convex programs with multi-block variables. The method is an accelerated version of a primal–dual algorithm proposed by the authors, which selects the block variables to update at random and establishes an O(1/t) convergence rate under a convexity assumption. We show that the rate can be accelerated to O(1/t^2) if the objective is strongly convex. In addition, if one block variable is independent of the others in the objective, we show that the algorithm can be modified to achieve a linear rate of convergence. Numerical experiments show that the accelerated method performs stably with a single set of parameters, whereas the original method must have its parameters tuned for each dataset to achieve a comparable level of performance.

KW - Accelerated first-order method

KW - Alternating direction method of multipliers (ADMM)

KW - Block coordinate update

KW - Primal–dual method

UR - http://www.scopus.com/inward/record.url?scp=85035147606&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85035147606&partnerID=8YFLogxK

U2 - 10.1007/s10589-017-9972-z

DO - 10.1007/s10589-017-9972-z

M3 - Article

VL - 70

SP - 91

EP - 128

JO - Computational Optimization and Applications

JF - Computational Optimization and Applications

SN - 0926-6003

IS - 1

ER -