Revisit of Estimate Sequence for Accelerated Gradient Methods

Bingcong Li, Mario Coutino, Georgios B. Giannakis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

In this paper, we revisit the problem of minimizing a convex function f(x) with Lipschitz continuous gradient via accelerated gradient methods (AGM). To do so, we consider the so-called estimate sequence (ES), a useful analysis tool for establishing the convergence of AGM. Given the importance of non-Euclidean norms in optimization, we develop a generalized ES that supports gradients that are Lipschitz continuous with respect to an arbitrary norm. Traditionally, ES consists of a sequence of quadratic functions that serve as surrogates of f(x). Such quadratic functions, however, preclude handling gradients whose Lipschitz continuity is defined w.r.t. non-Euclidean norms; hence, extending this powerful tool to the non-Euclidean setting is much needed. This extension is accomplished through a simple yet nontrivial modification of the standard ES. Further, our analysis provides insight into how acceleration is achieved, as well as an interpretation of the parameters involved in ES. Finally, numerical tests demonstrate the convergence benefits of taking non-Euclidean norms into account.
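
For context, the following is a minimal sketch of the classical estimate sequence of Nesterov that the paper generalizes; the symbols $\phi_k$, $\lambda_k$, $\gamma_k$, and $v_k$ follow standard textbook notation rather than the paper's own. A pair of sequences $\{\phi_k(x)\}_{k\ge 0}$ and $\{\lambda_k\}_{k\ge 0}$ with $\lambda_k \ge 0$ is an estimate sequence of $f$ if $\lambda_k \to 0$ and, for all $x$,

\[
\phi_k(x) \;\le\; (1-\lambda_k)\, f(x) + \lambda_k\, \phi_0(x).
\]

If, in addition, the iterates satisfy $f(x_k) \le \min_x \phi_k(x)$, then

\[
f(x_k) - f(x^\star) \;\le\; \lambda_k \big[\phi_0(x^\star) - f(x^\star)\big] \;\to\; 0,
\]

so the rate at which $\lambda_k$ vanishes dictates the convergence rate. In the classical construction the surrogates are Euclidean quadratics of the form

\[
\phi_k(x) \;=\; \phi_k^\star + \tfrac{\gamma_k}{2}\,\|x - v_k\|_2^2,
\]

which is precisely what ties the standard analysis to the Euclidean norm and motivates the generalized ES developed in the paper.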

Original language: English (US)
Title of host publication: 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3602-3606
Number of pages: 5
ISBN (Electronic): 9781509066315
DOIs
State: Published - May 2020
Event: 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020 - Barcelona, Spain
Duration: May 4, 2020 - May 8, 2020

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2020-May
ISSN (Print): 1520-6149

Conference

Conference: 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020
Country/Territory: Spain
City: Barcelona
Period: 5/4/20 - 5/8/20

Bibliographical note

Funding Information:
This research is supported in part by NSF 1508993, 1711471, 1901134 and the ASPIRE project (project 14926 within the STW OTP programme), financed by the Netherlands Organization for Scientific Research (NWO). Mario Coutino is partially supported by CONACYT. Emails: {lixx5599, georgios}@umn.edu; [email protected].

Publisher Copyright:
© 2020 IEEE.

Keywords

  • Nesterov's accelerated gradient method
  • estimate sequences
  • gradient descent
  • optimization
