Bregman alternating direction method of multipliers

Huahua Wang, Arindam Banerjee

Research output: Contribution to journal › Conference article › peer-review

109 Scopus citations

Abstract

The mirror descent algorithm (MDA) generalizes gradient descent by using a Bregman divergence in place of the squared Euclidean distance. In this paper, we similarly generalize the alternating direction method of multipliers (ADMM) to Bregman ADMM (BADMM), which allows the choice of different Bregman divergences to exploit the structure of problems. BADMM provides a unified framework for ADMM and its variants, including generalized ADMM, inexact ADMM, and Bethe ADMM. We establish global convergence and an O(1/T) iteration complexity for BADMM. In some cases, BADMM can be faster than ADMM by a factor of O(n/ln n), where n is the dimensionality. In solving the linear program of the mass transportation problem, BADMM admits massive parallelism and can easily run on GPUs. BADMM is several times faster than the highly optimized commercial solver Gurobi.
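
To make the mass transportation example above concrete, the following is a minimal NumPy sketch of KL-divergence BADMM updates for the transportation LP min <C, X> subject to X1 = a, X'1 = b, X >= 0. The function name badmm_transport, the penalty parameter rho, and the fixed iteration count are illustrative assumptions, not the paper's reference implementation. The coupling is split as X = Z so that each block update reduces to a multiplicative step followed by a single row or column normalization.

    import numpy as np

    def badmm_transport(C, a, b, rho=1.0, iters=500):
        # Sketch of BADMM with KL (Bregman) divergences for the
        # transportation LP (hypothetical names and parameters; see
        # the lead-in). Splitting X = Z lets each block update handle
        # only one of the two marginal constraints.
        m, n = C.shape
        X = np.outer(a, b)              # feasible start: product coupling
        Z = X.copy()
        Y = np.zeros((m, n))            # dual variable for X = Z

        for _ in range(iters):
            # X-update: multiplicative step, then row normalization
            # so that X @ 1 = a.
            X = Z * np.exp(-(C + Y) / rho)
            X *= (a / X.sum(axis=1))[:, None]
            # Z-update: multiplicative step, then column normalization
            # so that Z.T @ 1 = b.
            Z = X * np.exp(Y / rho)
            Z *= (b / Z.sum(axis=0))[None, :]
            # Dual ascent on the consensus constraint X = Z.
            Y += rho * (X - Z)
        return X

    # Usage: a small random instance with uniform marginals.
    rng = np.random.default_rng(0)
    C = rng.random((50, 50))
    a = np.full(50, 1.0 / 50)
    b = np.full(50, 1.0 / 50)
    X = badmm_transport(C, a, b)
    print("transport cost:", float((C * X).sum()))

Every step is elementwise apart from the two reductions (row and column sums), which is the structure the abstract credits for BADMM's massive parallelism on GPUs.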

Original language: English (US)
Pages (from-to): 2816-2824
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
Volume: 4
Issue number: January
State: Published - 2014
Event: 28th Annual Conference on Neural Information Processing Systems 2014, NIPS 2014 - Montreal, Canada
Duration: Dec 8, 2014 – Dec 13, 2014
