Zeroth-order stochastic projected gradient descent for nonconvex optimization

Sijia Liu, Xingguo Li, Pin-Yu Chen, Jarvis D. Haupt, Lisa Amini

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we analyze the convergence of the zeroth-order stochastic projected gradient descent (ZO-SPGD) method for constrained convex and nonconvex optimization scenarios where only objective function values (not gradients) are directly available. We show statistical properties of a new random gradient estimator, constructed through random direction samples drawn from a bounded uniform distribution. We prove that ZO-SPGD yields a O(d/(bq√T) + 1/√T) convergence rate for convex but non-smooth optimization, where d is the number of optimization variables, b is the minibatch size, q is the number of random direction samples for gradient estimation, and T is the number of iterations. For nonconvex optimization, we show that ZO-SPGD achieves a O(1/√T) convergence rate but suffers an additional O((d + q)/(bq)) error. Our theoretical investigation of ZO-SPGD provides a general framework to study the convergence rate of zeroth-order algorithms.
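The two ingredients described in the abstract, a random-direction gradient estimator built from function values only and a projected descent step, can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's implementation: directions are drawn uniformly from the unit sphere (via normalized Gaussians) as a stand-in for the paper's bounded uniform distribution, the objective is deterministic so the minibatch size b plays no role, and the names `zo_gradient`, `project_ball`, and `zo_spgd` are illustrative.

```python
import numpy as np

def zo_gradient(f, x, q=10, mu=1e-3, rng=None):
    """Zeroth-order gradient estimate of f at x from q random
    direction samples, using forward differences of function values."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    g = np.zeros(d)
    for _ in range(q):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)  # uniform direction on the unit sphere
        g += (f(x + mu * u) - f(x)) / mu * u
    return (d / q) * g  # d/q scaling matches the dimension-dependent factor

def project_ball(x, radius=1.0):
    """Euclidean projection onto the constraint set {x : ||x|| <= radius}."""
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def zo_spgd(f, x0, T=200, lr=0.1, q=10, mu=1e-3, radius=1.0, seed=0):
    """Run T projected descent steps driven only by function evaluations."""
    rng = np.random.default_rng(seed)
    x = project_ball(np.asarray(x0, dtype=float), radius)
    for _ in range(T):
        g = zo_gradient(f, x, q=q, mu=mu, rng=rng)
        x = project_ball(x - lr * g, radius)  # projected descent step
    return x

# Minimize f(x) = ||x - c||^2 over the unit ball using function values only.
c = np.array([0.5, -0.3, 0.2])
f = lambda x: float(np.sum((x - c) ** 2))
x_star = zo_spgd(f, np.zeros(3))
```

Increasing q reduces the variance of the gradient estimate at the cost of more function evaluations per iteration, which is the tradeoff captured by the q-dependence of the rates above.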

Original language: English (US)
Title of host publication: 2018 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2018 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1179-1183
Number of pages: 5
ISBN (Electronic): 9781728112954
DOIs: https://doi.org/10.1109/GlobalSIP.2018.8646618
State: Published - Feb 20 2019
Event: 2018 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2018 - Anaheim, United States
Duration: Nov 26 2018 - Nov 29 2018

Publication series

Name: 2018 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2018 - Proceedings

Conference

Conference: 2018 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2018
Country: United States
City: Anaheim
Period: 11/26/18 - 11/29/18

Keywords

  • Nonconvex optimization
  • Projected gradient descent
  • Zeroth-order optimization


  • Cite this

    Liu, S., Li, X., Chen, P. Y., Haupt, J. D., & Amini, L. (2019). Zeroth-order stochastic projected gradient descent for nonconvex optimization. In 2018 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2018 - Proceedings (pp. 1179-1183). [8646618] (2018 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2018 - Proceedings). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/GlobalSIP.2018.8646618