Abstract
Per-example gradient clipping is a key algorithmic step that enables practical differentially private (DP) training for deep learning models. The choice of clipping threshold R, however, is vital for achieving high accuracy under DP. We propose an easy-to-use replacement, called automatic clipping, that eliminates the need to tune R for any DP optimizer, including DP-SGD, DP-Adam, DP-LAMB and many others. The automatic variants are as private and computationally efficient as existing DP optimizers, but require no DP-specific hyperparameters and thus make DP training as amenable as standard non-private training. We give a rigorous convergence analysis of automatic DP-SGD in the non-convex setting, showing that it can enjoy an asymptotic convergence rate that matches that of standard SGD, under a symmetric gradient noise assumption on the per-sample gradients (commonly used in the non-DP literature). We demonstrate on various language and vision tasks that automatic clipping outperforms or matches the state-of-the-art, and can be easily employed with minimal changes to existing codebases.
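For intuition about the mechanics behind the abstract, below is a minimal NumPy sketch of one privatized gradient-aggregation step under automatic clipping: each per-sample gradient g_i is rescaled by 1/(‖g_i‖ + γ) instead of the standard per-example factor min(1, R/‖g_i‖), so the clipping threshold R disappears from the algorithm. The function name, the flat-vector interface, and the default values of γ and the noise multiplier are illustrative assumptions, not taken from the authors' codebase.

```python
import numpy as np

def automatic_dp_step(per_sample_grads, gamma=0.01, noise_multiplier=1.0, rng=None):
    """One gradient-aggregation step of DP-SGD with automatic clipping (sketch).

    Each per-sample gradient g_i is rescaled by 1 / (||g_i|| + gamma), so its
    norm is strictly below 1 and no clipping threshold R has to be tuned.
    Gaussian noise calibrated to that unit sensitivity is then added to the
    sum before averaging over the batch.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    batch_size = len(per_sample_grads)
    rescaled = [g / (np.linalg.norm(g) + gamma) for g in per_sample_grads]
    noisy_sum = np.sum(rescaled, axis=0) + noise_multiplier * rng.standard_normal(
        per_sample_grads[0].shape
    )
    return noisy_sum / batch_size

# Illustrative usage on synthetic per-sample gradients of very different scales.
grads = [np.random.randn(10) * scale for scale in (0.1, 1.0, 100.0)]
update = automatic_dp_step(grads)  # privatized average gradient
```

Because every rescaled gradient has norm below 1, the Gaussian noise is calibrated to unit sensitivity, so the privacy accounting is the same as for standard per-example clipping with R = 1; the stability constant γ comes from the paper, though the default value here is only a plausible choice.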
| Field | Value |
| --- | --- |
| Original language | English (US) |
| Title of host publication | Advances in Neural Information Processing Systems 36 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023 |
| Editors | A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, S. Levine |
| Publisher | Neural Information Processing Systems Foundation |
| ISBN (Electronic) | 9781713899921 |
| State | Published - 2023 |
| Externally published | Yes |
| Event | 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States |
| Duration | Dec 10, 2023 → Dec 16, 2023 |
Publication series

| Field | Value |
| --- | --- |
| Name | Advances in Neural Information Processing Systems |
| Volume | 36 |
| ISSN (Print) | 1049-5258 |
Conference

| Field | Value |
| --- | --- |
| Conference | 37th Conference on Neural Information Processing Systems, NeurIPS 2023 |
| Country/Territory | United States |
| City | New Orleans |
| Period | 12/10/23 → 12/16/23 |
Bibliographical note

Publisher Copyright: © 2023 Neural information processing systems foundation. All rights reserved.