TY - JOUR

T1 - On the Sublinear Convergence Rate of Multi-block ADMM

AU - Lin, Tian Yi

AU - Ma, Shi Qian

AU - Zhang, Shu Zhong

PY - 2015/9/13

Y1 - 2015/9/13

N2 - The alternating direction method of multipliers (ADMM) is widely used in solving structured convex optimization problems. Despite its success in practice, the convergence of the standard ADMM for minimizing the sum of N (N ⩾ 3) convex functions, whose variables are linked by linear constraints, has remained unclear for a very long time. Recently, Chen et al. (Math Program, doi:10.1007/s10107-014-0826-5, 2014) provided a counter-example showing that the ADMM for N ⩾ 3 may fail to converge without further conditions. Since the ADMM for N ⩾ 3 has been very successful when applied to many problems arising from real practice, it is worth further investigating under what sufficient conditions it can be guaranteed to converge. In this paper, we present such sufficient conditions that can guarantee the sublinear convergence rate for the ADMM for N ⩾ 3. Specifically, we show that if one of the functions is convex (not necessarily strongly convex), the other N - 1 functions are strongly convex, and the penalty parameter lies in a certain region, then the ADMM converges with rate O(1/t) in a certain ergodic sense and o(1/t) in a certain non-ergodic sense, where t denotes the number of iterations. As a by-product, we also provide a simple proof of the O(1/t) convergence rate of two-block ADMM in terms of both objective error and constraint violation, without assuming any condition on the penalty parameter or strong convexity of the functions.

KW - Alternating direction method of multipliers

KW - Convex optimization

KW - Sublinear convergence rate

UR - http://www.scopus.com/inward/record.url?scp=84941340210&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84941340210&partnerID=8YFLogxK

U2 - 10.1007/s40305-015-0092-0

DO - 10.1007/s40305-015-0092-0

M3 - Article

AN - SCOPUS:84941340210

VL - 3

SP - 251

EP - 274

JO - Journal of the Operations Research Society of China

JF - Journal of the Operations Research Society of China

SN - 2194-668X

IS - 3

ER -