Abstract
Consider the problem of minimizing the sum of two convex functions, one smooth and the other non-smooth. In this paper, we introduce a general class of approximate proximal splitting (APS) methods for solving such minimization problems. The APS class includes many well-known algorithms, such as the proximal splitting method, the block coordinate descent (BCD) method, and approximate gradient projection methods for smooth convex optimization. We establish the linear convergence of APS methods under a local error bound assumption. Since this assumption is known to hold for compressive sensing and sparse group LASSO problems, our analysis implies the linear convergence of the BCD method for these problems without a strong convexity assumption.
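To make the setting concrete, here is a minimal sketch of the basic proximal gradient (forward-backward splitting) iteration, the simplest instance of the kind of splitting the abstract discusses, applied to an L1-regularized least-squares (LASSO) problem. This is an illustration of the general technique, not the paper's APS algorithm; the function names, the step-size choice, and the test problem are assumptions made for the example.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, n_iters=500):
    # Minimize f(x) + g(x) with f(x) = 0.5 * ||Ax - b||^2 (smooth)
    # and g(x) = lam * ||x||_1 (non-smooth) via the iteration
    #   x_{k+1} = prox_{alpha * g}(x_k - alpha * grad f(x_k)).
    # Step size alpha = 1/L, where L = ||A||^2 is the Lipschitz
    # constant of grad f (illustrative choice).
    alpha = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)          # gradient of the smooth part
        x = soft_threshold(x - alpha * grad, alpha * lam)  # prox step
    return x

# Small compressive-sensing-style example with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 30, 70]] = [1.5, -2.0, 1.0]
b = A @ x_true
x_hat = proximal_gradient(A, b, lam=0.1)
```

In this noiseless setup the iterates should identify the support of `x_true`; the abstract's point is that, under a local error bound (which holds for such compressive sensing problems), convergence of splitting methods like this is linear even though the objective is not strongly convex.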
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 123-141 |
| Number of pages | 19 |
| Journal | Journal of the Operations Research Society of China |
| Volume | 2 |
| Issue number | 2 |
| DOIs | |
| State | Published - Jul 2014 |
Keywords
- Block coordinate descent method
- Convergence rate analysis
- Convex optimization
- Local error bound
- Proximal splitting method