GNSD: A Gradient-Tracking Based Nonconvex Stochastic Algorithm for Decentralized Optimization

Songtao Lu, Xinwei Zhang, Haoran Sun, Mingyi Hong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In the era of big data, it is challenging to train a machine learning model over a large-scale dataset on a single machine, or over a distributed system with a central controller. In this paper, we propose a gradient-tracking based nonconvex stochastic decentralized (GNSD) algorithm for solving nonconvex optimization problems, where the data is partitioned into multiple parts and processed by local computational resources. By exchanging parameters among nodes over a network, GNSD is able to find first-order stationary points (FOSPs) efficiently. The theoretical analysis guarantees that, with a shrinking step size, the convergence rate of GNSD to FOSPs matches the well-known $\mathcal{O}(1/\sqrt{T})$ rate of stochastic gradient descent. Finally, we perform extensive numerical experiments on computational clusters to demonstrate the advantage of GNSD over other state-of-the-art methods.
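The gradient-tracking scheme the abstract describes can be sketched as follows. This is a minimal illustration of the general gradient-tracking decentralized SGD update behind GNSD-style methods, not the authors' implementation: the ring mixing matrix `W`, the toy quadratic losses, the noise level, and the constant step size `alpha` are all illustrative assumptions (the paper shrinks the step size over time).

```python
import numpy as np

rng = np.random.default_rng(0)

n_nodes, dim = 4, 3
# Doubly stochastic mixing matrix for a ring of 4 nodes (assumption).
W = np.array([
    [0.5,  0.25, 0.0,  0.25],
    [0.25, 0.5,  0.25, 0.0 ],
    [0.0,  0.25, 0.5,  0.25],
    [0.25, 0.0,  0.25, 0.5 ],
])

# Toy local objectives f_i(x) = 0.5 * ||x - b_i||^2, so grad f_i(x) = x - b_i.
b = rng.normal(size=(n_nodes, dim))

def stoch_grad(i, x):
    """Stochastic gradient of node i's local loss; noise mimics minibatch sampling."""
    return (x - b[i]) + 0.01 * rng.normal(size=dim)

alpha = 0.1                                   # step size (constant here for simplicity)
x = rng.normal(size=(n_nodes, dim))           # local iterates, one row per node
g = np.array([stoch_grad(i, x[i]) for i in range(n_nodes)])
y = g.copy()                                  # gradient trackers, y_i^0 = g_i^0

for t in range(200):
    # Consensus step plus descent along the tracked gradient direction:
    #   x^{t+1} = W x^t - alpha * y^t
    x_new = W @ x - alpha * y
    g_new = np.array([stoch_grad(i, x_new[i]) for i in range(n_nodes)])
    # Gradient-tracking update: mix the trackers, then correct with fresh gradients:
    #   y^{t+1} = W y^t + g^{t+1} - g^t
    y = W @ y + g_new - g
    x, g = x_new, g_new

# Each node's iterate approaches the minimizer of the average loss, mean(b).
```

The tracker `y_i` maintains a running estimate of the network-wide average gradient, which is what lets every node descend on the global objective even though it only ever sees its own data partition.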

Original language: English (US)
Title of host publication: 2019 IEEE Data Science Workshop, DSW 2019 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 315-321
Number of pages: 7
ISBN (Electronic): 9781728107080
DOI: 10.1109/DSW.2019.8755807
State: Published - Jun 2019
Event: 2019 IEEE Data Science Workshop, DSW 2019 - Minneapolis, United States
Duration: Jun 2 2019 - Jun 5 2019

Publication series

Name: 2019 IEEE Data Science Workshop, DSW 2019 - Proceedings

Conference

Conference: 2019 IEEE Data Science Workshop, DSW 2019
Country: United States
City: Minneapolis
Period: 6/2/19 - 6/5/19


Keywords

  • Stochastic
  • decentralized
  • gradient tracking
  • neural networks
  • nonconvex optimization

Cite this

Lu, S., Zhang, X., Sun, H., & Hong, M. (2019). GNSD: A Gradient-Tracking Based Nonconvex Stochastic Algorithm for Decentralized Optimization. In 2019 IEEE Data Science Workshop, DSW 2019 - Proceedings (pp. 315-321). [8755807] (2019 IEEE Data Science Workshop, DSW 2019 - Proceedings). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/DSW.2019.8755807

GNSD: A Gradient-Tracking Based Nonconvex Stochastic Algorithm for Decentralized Optimization. / Lu, Songtao; Zhang, Xinwei; Sun, Haoran; Hong, Mingyi.

2019 IEEE Data Science Workshop, DSW 2019 - Proceedings. Institute of Electrical and Electronics Engineers Inc., 2019. p. 315-321 8755807 (2019 IEEE Data Science Workshop, DSW 2019 - Proceedings).


Lu, S, Zhang, X, Sun, H & Hong, M 2019, GNSD: A Gradient-Tracking Based Nonconvex Stochastic Algorithm for Decentralized Optimization. in 2019 IEEE Data Science Workshop, DSW 2019 - Proceedings., 8755807, 2019 IEEE Data Science Workshop, DSW 2019 - Proceedings, Institute of Electrical and Electronics Engineers Inc., pp. 315-321, 2019 IEEE Data Science Workshop, DSW 2019, Minneapolis, United States, 6/2/19. https://doi.org/10.1109/DSW.2019.8755807
Lu S, Zhang X, Sun H, Hong M. GNSD: A Gradient-Tracking Based Nonconvex Stochastic Algorithm for Decentralized Optimization. In 2019 IEEE Data Science Workshop, DSW 2019 - Proceedings. Institute of Electrical and Electronics Engineers Inc. 2019. p. 315-321. 8755807. (2019 IEEE Data Science Workshop, DSW 2019 - Proceedings). https://doi.org/10.1109/DSW.2019.8755807
Lu, Songtao; Zhang, Xinwei; Sun, Haoran; Hong, Mingyi. / GNSD: A Gradient-Tracking Based Nonconvex Stochastic Algorithm for Decentralized Optimization. 2019 IEEE Data Science Workshop, DSW 2019 - Proceedings. Institute of Electrical and Electronics Engineers Inc., 2019. pp. 315-321 (2019 IEEE Data Science Workshop, DSW 2019 - Proceedings).
@inproceedings{32515ff7c2fa4f949aeb2d731d65320e,
title = "GNSD: A Gradient-Tracking Based Nonconvex Stochastic Algorithm for Decentralized Optimization",
abstract = "In the era of big data, it is challenging to train a machine learning model on a single machine or over a distributed system with a central controller over a large-scale dataset. In this paper, we propose a gradient-tracking based nonconvex stochastic decentralized (GNSD) algorithm for solving nonconvex optimization problems, where the data is partitioned into multiple parts and processed by the local computational resource. Through exchanging the parameters at each node over a network, GNSD is able to find the first-order stationary points (FOSP) efficiently. From the theoretical analysis, it is guaranteed that the convergence rate of GNSD to FOSPs matches the well-known convergence rate $\mathcal{O}(1/\sqrt{T})$ of stochastic gradient descent by shrinking the step-size. Finally, we perform extensive numerical experiments on computational clusters to demonstrate the advantage of GNSD compared with other state-of-the-art methods.",
keywords = "Stochastic, decentralized, gradient tracking, neural networks, nonconvex optimization",
author = "Songtao Lu and Xinwei Zhang and Haoran Sun and Mingyi Hong",
year = "2019",
month = jun,
doi = "10.1109/DSW.2019.8755807",
isbn = "9781728107080",
language = "English (US)",
series = "2019 IEEE Data Science Workshop, DSW 2019 - Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "315--321",
booktitle = "2019 IEEE Data Science Workshop, DSW 2019 - Proceedings",
}

TY - GEN

T1 - GNSD

T2 - A Gradient-Tracking Based Nonconvex Stochastic Algorithm for Decentralized Optimization

AU - Lu, Songtao

AU - Zhang, Xinwei

AU - Sun, Haoran

AU - Hong, Mingyi

PY - 2019/6

Y1 - 2019/6

N2 - In the era of big data, it is challenging to train a machine learning model on a single machine or over a distributed system with a central controller over a large-scale dataset. In this paper, we propose a gradient-tracking based nonconvex stochastic decentralized (GNSD) algorithm for solving nonconvex optimization problems, where the data is partitioned into multiple parts and processed by the local computational resource. Through exchanging the parameters at each node over a network, GNSD is able to find the first-order stationary points (FOSP) efficiently. From the theoretical analysis, it is guaranteed that the convergence rate of GNSD to FOSPs matches the well-known convergence rate $\mathcal{O}(1/\sqrt{T})$ of stochastic gradient descent by shrinking the step-size. Finally, we perform extensive numerical experiments on computational clusters to demonstrate the advantage of GNSD compared with other state-of-the-art methods.

AB - In the era of big data, it is challenging to train a machine learning model on a single machine or over a distributed system with a central controller over a large-scale dataset. In this paper, we propose a gradient-tracking based nonconvex stochastic decentralized (GNSD) algorithm for solving nonconvex optimization problems, where the data is partitioned into multiple parts and processed by the local computational resource. Through exchanging the parameters at each node over a network, GNSD is able to find the first-order stationary points (FOSP) efficiently. From the theoretical analysis, it is guaranteed that the convergence rate of GNSD to FOSPs matches the well-known convergence rate $\mathcal{O}(1/\sqrt{T})$ of stochastic gradient descent by shrinking the step-size. Finally, we perform extensive numerical experiments on computational clusters to demonstrate the advantage of GNSD compared with other state-of-the-art methods.

KW - Stochastic

KW - decentralized

KW - gradient tracking

KW - neural networks

KW - nonconvex optimization

UR - http://www.scopus.com/inward/record.url?scp=85069450639&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85069450639&partnerID=8YFLogxK

U2 - 10.1109/DSW.2019.8755807

DO - 10.1109/DSW.2019.8755807

M3 - Conference contribution

AN - SCOPUS:85069450639

T3 - 2019 IEEE Data Science Workshop, DSW 2019 - Proceedings

SP - 315

EP - 321

BT - 2019 IEEE Data Science Workshop, DSW 2019 - Proceedings

PB - Institute of Electrical and Electronics Engineers Inc.

ER -