Abstract
Zhang, Lin, Jegelka, Sra, and Jadbabaie [1] introduce a method for minimizing Lipschitz functions with an efficiency guarantee of O(ε⁻⁴). Their method is a novel modification of Goldstein's classical subgradient method. Their work, however, makes use of a nonstandard subgradient oracle model and requires the function to be directionally differentiable. Our first contribution in this paper is to show that both of these assumptions can be dropped by simply adding a small random perturbation in each step of their algorithm. The resulting method works on any Lipschitz function whose value and gradient can be evaluated at points of differentiability. Our second contribution is a new cutting plane algorithm that achieves better efficiency in low dimensions: O(dε⁻³) for Lipschitz functions and O(dε⁻²) for those that are weakly convex.
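The abstract describes the perturbation idea only at a high level. Below is a minimal, hypothetical Python sketch of one randomly perturbed subgradient-style step in the spirit described above: since a Lipschitz function is differentiable almost everywhere (Rademacher's theorem), evaluating the gradient at a small random perturbation of the current iterate succeeds with probability one. The function name, parameters, and step rule here are illustrative assumptions, not the paper's actual algorithm, whose specific parameter schedules are what yield the stated O(ε⁻⁴) guarantee.

```python
import numpy as np

def perturbed_step(grad_f, x, radius=1e-6, step=1e-2, rng=None):
    """One illustrative perturbed step (hypothetical sketch).

    grad_f : callable returning the gradient of f at a point where f is
             differentiable; a Lipschitz f is differentiable almost
             everywhere, so a random perturbation of x lands at such a
             point with probability one.
    radius, step : assumed perturbation radius and step size; the paper's
             analysis prescribes its own schedules for these.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Draw a uniform point from the ball of radius `radius` around x.
    u = rng.standard_normal(x.shape)
    u *= radius * rng.uniform() ** (1.0 / x.size) / np.linalg.norm(u)
    g = grad_f(x + u)  # gradient exists at x + u almost surely
    # Normalized descent step, as in Goldstein-style subgradient methods.
    return x - step * g / max(np.linalg.norm(g), 1e-12)
```

For instance, iterating `x = perturbed_step(np.sign, x)` on f(x) = |x| (whose gradient at points of differentiability is sign(x)) drives x toward the minimizer 0 even though f is nondifferentiable there.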
| Original language | English (US) |
|---|---|
| Title of host publication | Advances in Neural Information Processing Systems 35 - 36th Conference on Neural Information Processing Systems, NeurIPS 2022 |
| Editors | S. Koyejo, S. Mohamed, A. Agarwal, D. Belgrave, K. Cho, A. Oh |
| Publisher | Neural information processing systems foundation |
| ISBN (Electronic) | 9781713871088 |
| State | Published - 2022 |
| Externally published | Yes |
| Event | 36th Conference on Neural Information Processing Systems, NeurIPS 2022 - New Orleans, United States |
| Duration | Nov 28, 2022 → Dec 9, 2022 |
Publication series
| Name | Advances in Neural Information Processing Systems |
|---|---|
| Volume | 35 |
| ISSN (Print) | 1049-5258 |
Conference
| Conference | 36th Conference on Neural Information Processing Systems, NeurIPS 2022 |
|---|---|
| Country/Territory | United States |
| City | New Orleans |
| Period | 11/28/22 → 12/9/22 |
Bibliographical note
Publisher Copyright: © 2022 Neural information processing systems foundation. All rights reserved.