Abstract
We consider distributed optimization of smooth convex objective functions over an undirected connected graph. Inspired by the mirror descent method and RLC circuits, we propose a novel distributed mirror descent method. Compared with the mirror-prox method, our algorithm achieves the same $\mathcal{O}(1/k)$ iteration complexity with only half the computation cost per iteration. We further extend our results to cases where (a) gradients are corrupted by stochastic noise, and (b) the objective function is composed of both smooth and non-smooth terms. We demonstrate our theoretical results via numerical experiments.
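To make the setting concrete, the following is a minimal sketch of *generic* distributed mirror descent over a graph, not the paper's RLC-inspired algorithm. Every detail here is an illustrative assumption: agents on a 5-node ring hold local copies on the probability simplex, minimize quadratic objectives $f_i(x) = \tfrac{1}{2}\|x - c_i\|^2$, mix dual variables with Metropolis weights, and map back to the primal space through the entropy mirror map (softmax).

```python
import numpy as np

# Hypothetical illustration: generic distributed mirror descent on a ring graph,
# NOT the specific RLC-inspired method of the paper. Agent i keeps a local copy
# x_i on the probability simplex and minimizes f_i(x) = 0.5 * ||x - c_i||^2.

def softmax(z):
    """Inverse entropy mirror map (numerically stabilized)."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

rng = np.random.default_rng(0)
n_agents, dim, eta = 5, 3, 0.05
c = rng.random((n_agents, dim))   # each agent's private target c_i

# Doubly stochastic mixing matrix for a 5-node ring (uniform neighbor weights).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, (i - 1) % n_agents] = W[i, (i + 1) % n_agents] = 1 / 3
    W[i, i] = 1 / 3

z = np.zeros((n_agents, dim))     # dual (mirror) variables, one row per agent
x = np.array([softmax(zi) for zi in z])
for k in range(1000):
    grads = x - c                 # local gradient: grad f_i(x_i) = x_i - c_i
    z = W @ z - eta * grads       # average duals with neighbors, step in gradient
    x = np.array([softmax(zi) for zi in z])

# Agents' local copies approach a common point on the simplex.
print(np.round(x, 3))
```

With a Euclidean mirror map in place of the entropy map, this same template reduces to ordinary distributed (sub)gradient descent; the mirror map is what adapts the geometry to the constraint set.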
| Original language | English (US) |
|---|---|
| Article number | 8993740 |
| Pages (from-to) | 548-553 |
| Number of pages | 6 |
| Journal | IEEE Control Systems Letters |
| Volume | 4 |
| Issue number | 3 |
| DOIs | |
| State | Published - Jul 2020 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2017 IEEE.
Keywords
- Distributed optimization
- mirror descent