Hybrid Block Successive Approximation for One-Sided Non-Convex Min-Max Problems: Algorithms and Applications

Songtao Lu, Ioannis Tsaknakis, Mingyi Hong, Yongxin Chen

Research output: Contribution to journal › Article › peer-review

17 Scopus citations


The min-max problem, also known as the saddle point problem, is a class of optimization problems in which one simultaneously minimizes over one subset of variables and maximizes over another. This class of problems can be used to formulate a wide range of signal processing and communication (SPCOM) problems. Despite its popularity, existing theory for this class has been developed mainly for problems with certain special convex-concave structure. Therefore, it cannot be used to guide algorithm design for many interesting problems in SPCOM, where various kinds of non-convexity arise. In this work, we consider a block-wise one-sided non-convex min-max problem, in which the minimization problem consists of multiple blocks and is non-convex, while the maximization problem is (strongly) concave. We propose a class of simple algorithms named Hybrid Block Successive Approximation (HiBSA), which alternately performs gradient descent-type steps for the minimization blocks and gradient ascent-type steps for the maximization problem. A key element of the proposed algorithm is the use of certain regularization and penalty sequences, which stabilize the algorithm and ensure convergence. We show that HiBSA converges to properly defined first-order stationary solutions with quantifiable global rates. To validate the efficiency of the proposed algorithms, we conduct numerical tests on a number of problems, including a robust learning problem, a non-convex min-utility maximization problem, and a wireless jamming problem arising in interfering channels.
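The stabilizing role of the regularization sequence described above can be illustrated on a toy saddle problem. The sketch below is not the HiBSA algorithm from the paper (which handles multiple non-convex minimization blocks via successive approximation); it is a minimal single-block gradient descent/ascent loop on min_x max_y f(x, y) = x*y, where plain alternating updates cycle around the stationary point (0, 0), and a diminishing regularizer -gamma_t * y^2 added to the ascent step damps that oscillation. The step sizes and the gamma_t schedule are illustrative choices, not the constants analyzed in the paper.

```python
import math

def gda_regularized(x0=1.0, y0=1.0, alpha=0.05, beta=0.05, iters=5000):
    """Alternating gradient descent/ascent on f(x, y) = x * y,
    with a diminishing regularizer on the ascent step (illustrative
    sketch in the spirit of HiBSA, not the paper's exact algorithm)."""
    x, y = x0, y0
    for t in range(1, iters + 1):
        gamma_t = 1.0 / math.sqrt(t)           # diminishing regularization
        x = x - alpha * y                       # descent step: d/dx (x*y) = y
        y = y + beta * (x - 2.0 * gamma_t * y)  # ascent step on x*y - gamma_t*y^2
    return x, y

x_star, y_star = gda_regularized()
```

With gamma_t = 0 the iterates rotate around (0, 0) at roughly constant distance; with the diminishing regularizer, each step contracts the iterates slightly, and since the gamma_t sum diverges, the iterates are driven toward the stationary point.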

Original language: English (US)
Article number: 9070155
Pages (from-to): 3676-3691
Number of pages: 16
Journal: IEEE Transactions on Signal Processing
State: Published - 2020

Bibliographical note

Funding Information:
Manuscript received February 17, 2019; revised August 13, 2019, November 22, 2019, and January 8, 2020; accepted March 3, 2020. Date of publication April 17, 2020; date of current version June 26, 2020. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Weiyu Xu. The work of Songtao Lu, Ioannis Tsaknakis, and Mingyi Hong was supported in part by NSF under Grants CMMI-172775 and CIF-1910385 and in part by ARO under Grant W911NF-19-1-0247. The work of Yongxin Chen was supported by NSF under Grant 1901599. (Songtao Lu and Ioannis Tsaknakis contributed equally to this work.) (Corresponding author: Mingyi Hong.) Songtao Lu was with the University of Minnesota, Minneapolis, MN 55455 USA. He is now with IBM Research AI, IBM Thomas J. Watson Research Center, Yorktown Heights, NY 10598 USA (e-mail: songtao@ibm.com).

Publisher Copyright:
© 1991-2012 IEEE.


  • Min-max optimization
  • block successive approximation
  • gradient descent and ascent
  • saddle point problems


