## Abstract

We consider distributed non-convex optimization, where a network of agents aims to minimize a global function over the Stiefel manifold. The global function is a finite sum of smooth local functions, where each local function is associated with one agent and agents communicate with each other over an undirected connected graph. The problem is non-convex because the local functions are possibly non-convex (though smooth) and the Stiefel manifold is a non-convex set. We present a decentralized Riemannian stochastic gradient method (DRSGD) with a convergence rate of O(1/√K) to a stationary point. To achieve exact convergence with a constant stepsize, we also propose a decentralized Riemannian gradient tracking algorithm (DRGTA) with a convergence rate of O(1/K) to a stationary point. We use multi-step consensus to keep the iterates in a local consensus region. DRGTA is the first decentralized algorithm with exact convergence for distributed optimization on the Stiefel manifold.
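To make the setting concrete, the following is a minimal NumPy sketch of the ingredients the abstract describes: agents on a network alternate multi-step consensus averaging (followed by a retraction back onto the Stiefel manifold) with local Riemannian gradient steps. The tangent-space projection and QR retraction are standard Stiefel-manifold tools; the test problem (each agent holds a local PSD matrix and the network maximizes a summed Rayleigh quotient), the uniform mixing matrix `W`, the stepsize, and the number of consensus rounds are all illustrative assumptions, not the paper's exact DRSGD/DRGTA updates.

```python
import numpy as np

def sym(M):
    # Symmetric part of a square matrix
    return 0.5 * (M + M.T)

def proj_tangent(x, g):
    # Project a Euclidean direction g onto the tangent space of the
    # Stiefel manifold St(n, p) at the point x (x.T @ x = I)
    return g - x @ sym(x.T @ g)

def qr_retract(x, v):
    # QR-based retraction: map the tangent step x + v back onto St(n, p)
    q, r = np.linalg.qr(x + v)
    # Fix column signs so the retraction is uniquely defined
    return q * np.sign(np.sign(np.diag(r)) + 0.5)

rng = np.random.default_rng(0)
n, p, n_agents = 8, 2, 4

# Illustrative local data: each agent i holds a PSD matrix A[i] and the
# network maximizes sum_i tr(x.T @ A[i] @ x) over the Stiefel manifold
A = [a @ a.T / n for a in (rng.standard_normal((n, n)) for _ in range(n_agents))]
# Doubly stochastic mixing matrix for a complete graph (assumption)
W = np.full((n_agents, n_agents), 1.0 / n_agents)

# All agents start from a common Stiefel point
x0, _ = np.linalg.qr(rng.standard_normal((n, p)))
xs = [x0.copy() for _ in range(n_agents)]

alpha, t_consensus = 0.01, 2  # stepsize and consensus rounds (assumptions)
for k in range(200):
    # Multi-step consensus: average neighbors, then retract to the manifold
    for _ in range(t_consensus):
        avgs = [sum(W[i, j] * xs[j] for j in range(n_agents)) for i in range(n_agents)]
        xs = [qr_retract(x, proj_tangent(x, a - x)) for x, a in zip(xs, avgs)]
    # Local Riemannian gradient step: Euclidean gradient of -tr(x.T A_i x)
    # is -2 A_i x; project it to the tangent space and retract
    egrads = [-2.0 * A[i] @ xs[i] for i in range(n_agents)]
    xs = [qr_retract(x, -alpha * proj_tangent(x, g)) for x, g in zip(xs, egrads)]
```

Every iterate stays exactly on the Stiefel manifold because updates move along tangent directions and are retracted via QR, and the repeated consensus rounds keep the agents' local variables close to one another, mirroring the "local consensus region" idea in the abstract.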

Original language | English (US)
---|---
Title of host publication | Proceedings of the 38th International Conference on Machine Learning, ICML 2021
Publisher | ML Research Press
Pages | 1594-1605
Number of pages | 12
ISBN (Electronic) | 9781713845065
State | Published - 2021
Externally published | Yes
Event | 38th International Conference on Machine Learning, ICML 2021 - Virtual, Online. Duration: Jul 18 2021 → Jul 24 2021

### Publication series

Name | Proceedings of Machine Learning Research
---|---
Volume | 139
ISSN (Electronic) | 2640-3498

### Conference

Conference | 38th International Conference on Machine Learning, ICML 2021
---|---
City | Virtual, Online
Period | 7/18/21 → 7/24/21

### Bibliographical note

Publisher Copyright: © 2021 by the author(s).