Abstract
Recently, there has been growing interest in the study of median-based algorithms for distributed non-convex optimization. Two prominent examples are SIGNSGD with majority vote, an effective approach for communication reduction via 1-bit compression of the local gradients, and MEDIANSGD, an algorithm recently proposed to ensure robustness against Byzantine workers. The convergence analyses for these algorithms critically rely on the assumption that all the distributed data are drawn i.i.d. from the same distribution. However, in applications such as Federated Learning, the data across different nodes or machines can be inherently heterogeneous, which violates this i.i.d. assumption. This work analyzes SIGNSGD and MEDIANSGD in distributed settings with heterogeneous data. We show that these algorithms are non-convergent whenever there is some disparity between the expected median and the expected mean of the local gradients. To close this gap, we propose a novel gradient correction mechanism that perturbs the local gradients with noise, which we show provably closes the gap between the mean and the median of the gradients. The proposed methods largely preserve the desirable properties of these median-based algorithms, such as the low per-iteration communication complexity of SIGNSGD, while additionally enjoying global convergence to stationary solutions. Our perturbation technique may be of independent interest whenever one wishes to estimate a mean through a median estimator.
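For intuition, below is a minimal NumPy sketch of the failure mode and the fix described above: with heterogeneous workers, the coordinate-wise majority vote over gradient signs (the sign of the median of the signs) can oppose the sign of the mean gradient, and perturbing each local gradient with sufficiently spread zero-mean noise realigns the vote with the mean. The Gaussian noise model and the scale `sigma` are illustrative assumptions for this sketch, not the paper's exact correction mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

def signsgd_majority_vote(local_grads):
    """One aggregation step of SIGNSGD with majority vote:
    each worker sends sign(g_i) (1 bit per coordinate); the server
    returns the coordinate-wise majority sign."""
    signs = np.sign(local_grads)        # 1-bit compression at each worker
    return np.sign(signs.sum(axis=0))   # majority vote per coordinate

# Heterogeneous local gradients on 3 workers (a single coordinate):
# the mean is (10 - 1 - 1) / 3 > 0, but two of three signs are negative,
# so the uncorrected vote points against the mean direction.
grads = np.array([[10.0], [-1.0], [-1.0]])
print(signsgd_majority_vote(grads))     # [-1.]  disagrees with sign of mean

# Illustrative noise perturbation: adding zero-mean noise with a large
# enough spread (hypothetical scale `sigma`) before taking signs makes
# the vote agree with the sign of the mean in expectation; we average
# over many independent trials to expose the expected behavior.
sigma, trials = 30.0, 10_000
noise = rng.normal(0.0, sigma, size=(3, 1, trials))
noisy_signs = np.sign(grads[..., None] + noise)
votes = noisy_signs.sum(axis=0)         # majority vote in each trial
print(np.sign(votes.mean(axis=-1)))     # [1.]  matches sign of the mean
```

In higher dimensions the same comparison applies coordinate-wise; the paper's analysis quantifies how the injected noise provably closes the mean-median gap, which this toy experiment only illustrates.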
Original language | English (US)
---|---
Journal | Advances in Neural Information Processing Systems
Volume | 2020-December
State | Published - 2020
Event | 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online. Duration: Dec 6, 2020 → Dec 12, 2020
Bibliographical note
Funding Information: The research is supported in part by grants AFOSR-19RT0424, ARO-W911NF-19-1-0247, a Google Faculty Research Award, a J.P. Morgan Faculty Award, and a Facebook Research Award.
Publisher Copyright:
© 2020 Neural information processing systems foundation. All rights reserved.