An optimal high-order tensor method for convex optimization

Bo Jiang, Haoyue Wang, Shuzhong Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

This paper is concerned with finding an optimal algorithm for minimizing a composite convex objective function. The basic setting is that the objective is the sum of two convex functions: the first function is smooth with up to the dth-order derivative information available, and the second function is possibly nonsmooth, but its proximal tensor mappings can be computed approximately in an efficient manner. The problem is to find, in that setting, the best possible (optimal) iteration complexity for convex optimization. Along that line, for the smooth case (without the second nonsmooth part in the objective), Nesterov proposed an optimal algorithm for the first-order methods (d = 1) with iteration complexity O(1/k^2), whereas high-order tensor algorithms (using up to general dth-order tensor information) with iteration complexity O(1/k^(d+1)) were recently established. In this paper, we propose a new high-order tensor algorithm for the general composite case, with the iteration complexity of O(1/k^((3d+1)/2)), which matches the lower bound for the dth-order methods as previously established and hence is optimal. Our approach is based on the accelerated hybrid proximal extragradient (A-HPE) framework proposed by Monteiro and Svaiter, where a bisection procedure is installed for each A-HPE iteration. At each bisection step, a proximal tensor subproblem is approximately solved, and the total number of bisection steps per A-HPE iteration is shown to be bounded by a logarithmic factor in the precision required.
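To make the d = 1 baseline cited in the abstract concrete, the following is a minimal sketch of Nesterov's accelerated gradient method, the first-order scheme that attains the optimal O(1/k^2) rate for smooth convex minimization. The quadratic test problem, step size, and function names below are illustrative choices, not taken from the paper.

```python
import numpy as np

def nesterov_agd(grad, x0, lipschitz, iters):
    """Accelerated gradient descent with the classic momentum schedule.

    Achieves f(x_k) - f* = O(1/k^2) for L-smooth convex f.
    """
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        # Gradient step taken from the extrapolated point y
        x_next = y - grad(y) / lipschitz
        # Momentum parameter update: t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Extrapolation (momentum) step
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Example (illustrative): minimize f(x) = 0.5 * x^T A x, minimized at x = 0.
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x
x_star = nesterov_agd(grad, np.ones(3), lipschitz=100.0, iters=300)
print(np.linalg.norm(x_star))  # small: the iterate is near the minimizer 0
```

The high-order methods discussed in the abstract generalize this template by replacing the gradient step with a dth-order proximal tensor subproblem, which is where the A-HPE framework and the per-iteration bisection come in.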

Original language: English (US)
Pages (from-to): 1390-1412
Number of pages: 23
Journal: Mathematics of Operations Research
Volume: 46
Issue number: 4
DOIs
State: Published - Nov 2021
Externally published: Yes

Bibliographical note

Funding Information:
Funding: B. Jiang’s research was supported in part by the National Natural Science Foundation of China [Grants 11771269 and 11831002] and Program for Innovative Research Team of Shanghai University of Finance and Economics. S. Zhang’s research was supported in part by the National Science Foundation [Grant CMMI-1462408].

Publisher Copyright:
© 2021 INFORMS

Keywords

  • Acceleration
  • Convex optimization
  • Iteration complexity
  • Tensor method
