A distributed algorithm for dictionary learning over networks

Ming Min Zhao, Qingjiang Shi, Mingyi Hong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

In this work, we present a new distributed algorithm for a non-convex and nonsmooth dictionary learning problem. The proposed algorithm, named proximal primal-dual algorithm with increasing penalty (Prox-PDA-IP), is a primal-dual scheme in which the primal step minimizes a certain approximation of the augmented Lagrangian of the problem and the dual step performs an approximate dual ascent. We provide a proof outline for convergence to stationary points, which is mainly based on constructing a new potential function that is guaranteed to decrease after some finite number of iterations. Numerical results are presented to validate the effectiveness of the proposed algorithm.
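
The abstract describes Prox-PDA-IP only at a high level. Purely as an illustration, the sketch below shows a generic proximal primal-dual iteration with an increasing penalty on a toy quadratic consensus problem over a line graph; the local losses f_i, the graph, the proximal term, and the penalty schedule are all assumptions made here for illustration and are not the paper's exact algorithm.

```python
import numpy as np

# Hypothetical toy setup (not from the paper): n agents on a line graph each
# hold a local quadratic loss f_i(x) = 0.5 * (x - b_i)^2 and must agree on x.
rng = np.random.default_rng(0)
n = 5
b = rng.normal(size=n)

# Incidence matrix A of the line graph; consensus is encoded as A @ x = 0.
A = np.zeros((n - 1, n))
for e in range(n - 1):
    A[e, e], A[e, e + 1] = 1.0, -1.0

x = np.zeros(n)        # primal variables (one copy per agent)
lam = np.zeros(n - 1)  # dual variables (one per edge)
rho = 1.0              # penalty parameter, increased over iterations

for it in range(200):
    # Primal step: minimize the augmented Lagrangian plus a simple proximal
    # term (rho/2)*||x - x_prev||^2 (a generic surrogate, not the paper's
    # specific approximation). For quadratic losses this is a linear solve.
    H = np.eye(n) + rho * (A.T @ A) + rho * np.eye(n)
    rhs = b - A.T @ lam + rho * x
    x = np.linalg.solve(H, rhs)

    # Dual step: (approximate) dual ascent on the consensus constraint.
    lam = lam + rho * (A @ x)

    # Increasing penalty, echoing the "IP" in Prox-PDA-IP.
    rho *= 1.01

print("consensus gap:", np.linalg.norm(A @ x), "value:", x.mean(), "target:", b.mean())
```

In this toy quadratic case the primal subproblem can be solved exactly; the paper instead treats a non-convex, nonsmooth dictionary learning objective, where the primal step only minimizes an approximation of the augmented Lagrangian and convergence is argued via a potential function that decreases after finitely many iterations.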

Original language: English (US)
Title of host publication: 2016 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2016 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 505-509
Number of pages: 5
ISBN (Electronic): 9781509045457
DOIs
State: Published - Apr 19 2017
Event: 2016 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2016 - Washington, United States
Duration: Dec 7 2016 - Dec 9 2016

Publication series

Name: 2016 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2016 - Proceedings

Other

Other: 2016 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2016
Country/Territory: United States
City: Washington
Period: 12/7/16 - 12/9/16

Bibliographical note

Funding Information:
The work of M. Hong is supported in part by NSF under Grant CCF-1526078 and by AFOSR under grant 15RT0767. The work of Q. Shi is supported by NSFC under grants 61671411 and 61302076.

Publisher Copyright:
© 2016 IEEE.
