Accelerating Frank-Wolfe with weighted average gradients

Yilang Zhang, Bingcong Li, Georgios B. Giannakis

Research output: Contribution to journal › Conference article › peer-review

Abstract

Relying on conditional-gradient iterations, the Frank-Wolfe (FW) algorithm has been a popular solver for constrained convex optimization problems in signal processing and machine learning, thanks to its low per-iteration complexity. The present contribution broadens its scope by replacing the gradient in each FW subproblem with a weighted average of gradients. This generalization speeds up the convergence of FW by alleviating its zigzagging behavior. A geometric interpretation of the averaged gradients is provided, and convergence guarantees are established for three different weight combinations. Numerical comparisons demonstrate the effectiveness of the proposed methods.
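The abstract does not spell out the algorithmic details, so the following is only a minimal NumPy sketch of the idea it describes: a Frank-Wolfe loop whose linear-minimization oracle is fed a weighted average of past gradients rather than the current gradient. The exponential averaging weight `delta`, the l1-ball constraint, and the classic 2/(t+2) step size are illustrative assumptions, not the specific weight combinations analyzed in the paper.

```python
import numpy as np

def lmo_l1_ball(g, radius=1.0):
    """Linear minimization oracle over the l1 ball:
    argmin_{||s||_1 <= radius} <g, s> is a signed vertex."""
    i = np.argmax(np.abs(g))
    s = np.zeros_like(g)
    s[i] = -radius * np.sign(g[i])
    return s

def fw_averaged_gradients(grad, x0, lmo, n_iters=200, delta=0.5):
    """Frank-Wolfe where the LMO direction is an exponentially
    weighted average of past gradients (delta is a hypothetical
    weight; the paper studies several schedules not shown here)."""
    x = x0.copy()
    d = np.zeros_like(x0)                   # running averaged gradient
    for t in range(n_iters):
        g = grad(x)
        d = (1.0 - delta) * d + delta * g   # weighted average of gradients
        s = lmo(d)                          # FW subproblem on the averaged direction
        gamma = 2.0 / (t + 2.0)             # classic FW step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# Toy least-squares instance constrained to the l1 ball.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
grad = lambda x: A.T @ (A @ x - b)
x_hat = fw_averaged_gradients(grad, np.zeros(20), lmo_l1_ball)
print(0.5 * np.linalg.norm(A @ x_hat - b) ** 2)
```

Setting delta = 1 in this sketch recovers vanilla FW; smaller values smooth the LMO direction across iterations, which is the averaging mechanism the abstract credits for damping the zigzagging behavior.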

Original language: English (US)
Pages (from-to): 5529-5533
Number of pages: 5
Journal: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2021-June
State: Published - Jun 6 2021
Event: 2021 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2021 - Virtual, Toronto, Canada
Duration: Jun 6 2021 - Jun 11 2021

Bibliographical note

Funding Information:
Research in this paper was supported in part by NSF grant 1901134. Emails: {zhan7453, lixx5599, georgios}@umn.edu

Publisher Copyright:
©2021 IEEE

Keywords

  • Conditional gradient approach
  • Convex optimization
  • Frank-Wolfe method
