Randomized Block Frank-Wolfe for Convergent Large-Scale Learning

Liang Zhang, Gang Wang, Daniel Romero, Georgios B. Giannakis

Research output: Contribution to journal › Article

5 Citations (Scopus)

Abstract

Owing to their low-complexity iterations, Frank-Wolfe (FW) solvers are well suited for various large-scale learning tasks. When block-separable constraints are present, randomized block FW (RB-FW) has been shown to further reduce complexity by updating only a fraction of coordinate blocks per iteration. To circumvent the limitations of existing methods, this paper develops step sizes for RB-FW that enable a flexible selection of the number of blocks to update per iteration while ensuring convergence and feasibility of the iterates. To this end, convergence rates of RB-FW are established through computational bounds on a primal suboptimality measure and on the duality gap. The novel bounds extend the existing convergence analysis, which only applies to a step-size sequence that does not generally lead to feasible iterates. Furthermore, two classes of step-size sequences that guarantee feasibility of the iterates are also proposed to enhance flexibility in choosing decay rates. The novel convergence results are markedly broadened to also encompass nonconvex objectives, and further assert that RB-FW with exact line-search reaches a stationary point at rate O(1/√t). Performance of RB-FW with different step sizes and numbers of blocks is demonstrated in two applications, namely charging of electric vehicles and structural support vector machines. Extensive simulated tests demonstrate the performance improvement of RB-FW relative to existing randomized single-block FW methods.
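
As a rough illustration of the method summarized above, the sketch below shows a generic randomized block Frank-Wolfe iteration over a block-separable constraint set: a random subset of blocks is drawn, a per-block linear minimization oracle (LMO) is solved, and only the selected blocks are moved toward their FW vertices. The helper names (block_grad, block_lmo) and the default 2/(t+2) decay are illustrative assumptions; they are not the feasibility-preserving step-size sequences designed in the paper.

```python
import numpy as np

def rb_fw(block_grad, block_lmo, x0, blocks_per_iter, n_iters, step_size=None, rng=None):
    """Generic randomized block Frank-Wolfe sketch (illustrative, not the paper's exact method).

    x0               : list of per-block variables (NumPy arrays), one per feasible set X_b
    block_grad(x, b) : gradient of the objective w.r.t. block b at the full iterate x
    block_lmo(b, g)  : argmin over s in X_b of <g, s>  (linear minimization oracle for block b)
    step_size(t)     : step-size sequence; defaults to the classical 2/(t+2) decay (assumption)
    """
    rng = np.random.default_rng() if rng is None else rng
    x = [xb.copy() for xb in x0]
    n_blocks = len(x)
    for t in range(n_iters):
        gamma = step_size(t) if step_size is not None else 2.0 / (t + 2.0)
        # Update only a random fraction of the blocks; this is the source of the per-iteration savings.
        for b in rng.choice(n_blocks, size=blocks_per_iter, replace=False):
            s_b = block_lmo(b, block_grad(x, b))        # FW vertex of block b's feasible set
            x[b] = (1.0 - gamma) * x[b] + gamma * s_b   # stays in X_b whenever gamma is in [0, 1]
    return x
```

As a toy usage over a product of probability simplices, the per-block LMO can be `lambda b, g: np.eye(len(g))[np.argmin(g)]`, i.e., the simplex vertex aligned with the most negative gradient coordinate; with a single block that is updated every iteration, the loop reduces to the classical FW method.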

Original language: English (US)
Article number: 8047993
Pages (from-to): 6448-6461
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Volume: 65
Issue number: 24
DOI: 10.1109/TSP.2017.2755597
State: Published - Dec 15, 2017

Keywords

  • Conditional gradient descent
  • block coordinate
  • nonconvex optimization
  • parallel optimization

Cite this

Randomized Block Frank-Wolfe for Convergent Large-Scale Learning. / Zhang, Liang; Wang, Gang; Romero, Daniel; Giannakis, Georgios B.

In: IEEE Transactions on Signal Processing, Vol. 65, No. 24, 8047993, 15.12.2017, p. 6448-6461.

@article{c90b3f6f590e42ea8b5add4595e73e97,
title = "Randomized Block Frank-Wolfe for Convergent Large-Scale Learning",
keywords = "Conditional gradient descent, block coordinate, nonconvex optimization, parallel optimization",
author = "Liang Zhang and Gang Wang and Daniel Romero and Giannakis, {Georgios B}",
year = "2017",
month = "12",
day = "15",
doi = "10.1109/TSP.2017.2755597",
language = "English (US)",
volume = "65",
pages = "6448--6461",
journal = "IEEE Transactions on Signal Processing",
issn = "1053-587X",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "24",

}
