Abstract
The alternating direction method of multipliers (ADMM) has been successfully applied to solve structured convex optimization problems because of its superior practical performance. The convergence properties of the 2-block ADMM have been studied extensively in the literature. In particular, the 2-block ADMM has been proven to converge globally for any penalty parameter γ > 0. In this sense, the 2-block ADMM allows the parameter to be free, i.e., no restriction on its value is needed to ensure convergence when implementing the algorithm. For the 3-block ADMM, however, Chen et al. (Math Program 155:57–79, 2016) recently constructed a counter-example showing that it can diverge if no further condition is imposed. Existing results that establish further sufficient conditions guaranteeing convergence of the 3-block ADMM usually require γ to be smaller than a certain bound, which is often either difficult to compute or too small for the algorithm to be practical. In this paper, we show that the 3-block ADMM still converges globally for any penalty parameter γ > 0 if the third function f3 in the objective is smooth and strongly convex with condition number in [1, 1.0798), in addition to some other mild conditions. This requirement covers an important class of problems, called regularized least squares decomposition (RLSD) in this paper.
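To make the iteration scheme concrete, the following is a minimal sketch of a 3-block ADMM applied to a toy decomposition problem (a sparse block plus two quadratic blocks, with all coupling matrices equal to the identity). The function `admm3`, the weights `mu2` and `mu3`, and the problem instance are illustrative assumptions for this sketch, not the paper's RLSD model. Note that the quadratic f3 below has condition number 1, inside the interval [1, 1.0798) appearing in the main result.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the prox operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm3(b, mu2=1.0, mu3=1.0, gamma=1.0, iters=500):
    """3-block ADMM (hypothetical toy instance) for
        min ||x1||_1 + (mu2/2)||x2||^2 + (mu3/2)||x3||^2
        s.t. x1 + x2 + x3 = b.
    f3 has Hessian mu3*I, so its condition number is 1, within [1, 1.0798).
    """
    n = b.size
    x1 = np.zeros(n); x2 = np.zeros(n); x3 = np.zeros(n)
    lam = np.zeros(n)  # multiplier of the coupling constraint
    for _ in range(iters):
        # Gauss-Seidel sweep: minimize the augmented Lagrangian in each
        # block, holding the other two blocks fixed (closed forms here).
        x1 = soft_threshold(b - x2 - x3 + lam / gamma, 1.0 / gamma)
        x2 = (lam + gamma * (b - x1 - x3)) / (mu2 + gamma)
        x3 = (lam + gamma * (b - x1 - x2)) / (mu3 + gamma)
        # Dual ascent step on the multiplier.
        lam = lam - gamma * (x1 + x2 + x3 - b)
    return x1, x2, x3

rng = np.random.default_rng(0)
b = rng.normal(size=50)
# Per the paper's result, no upper bound on gamma should be needed for
# an instance of this kind; gamma = 10 is an arbitrary choice.
x1, x2, x3 = admm3(b, gamma=10.0)
print(np.linalg.norm(x1 + x2 + x3 - b))  # feasibility residual -> 0
```

Each block subproblem is solved exactly in closed form (soft-thresholding for the ℓ1 block, a linear solve for each quadratic block), which is what makes the per-iteration cost of ADMM low on structured problems of this type.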
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 69–88 |
| Number of pages | 20 |
| Journal | Journal of Scientific Computing |
| Volume | 76 |
| Issue number | 1 |
| State | Published - Jul 1 2018 |
Bibliographical note
Publisher Copyright: © 2017, Springer Science+Business Media, LLC, part of Springer Nature.
Keywords
- ADMM
- Convex minimization
- Global convergence
- Regularized least squares decomposition