Distributed lasso for in-network linear regression

Juan Andrés Bazerque, Gonzalo Mateos, Georgios B. Giannakis

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution


Abstract

The least-absolute shrinkage and selection operator (Lasso) is a popular tool for joint estimation and continuous variable selection, especially well suited to under-determined but sparse linear regression problems. This paper develops an algorithm to estimate the regression coefficients via the Lasso when the training data are distributed across different agents and cannot be communicated to a central processing unit, e.g., due to communication cost or privacy concerns. The novel distributed algorithm is obtained by reformulating the Lasso into a separable form, which is iteratively minimized using the alternating-direction method of multipliers so as to gain the desired degree of parallelization. The per-agent estimate updates are given by simple soft-thresholding operations, and the inter-agent communication overhead remains at an affordable level. Without exchanging elements of the different training sets, the local estimates provably reach consensus on the global Lasso solution, i.e., the fit that would be obtained if the entire data set were centrally available. Numerical experiments corroborate the convergence and global optimality of the proposed distributed scheme.
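The update structure the abstract describes — local per-agent solves coupled through a soft-thresholding consensus step — can be illustrated with a standard consensus-form ADMM for the Lasso. This is a minimal sketch simulated in a single process (the agent loop stands in for in-network message passing), not the paper's exact algorithm; all function and variable names here are illustrative.

```python
import numpy as np

def soft_threshold(v, k):
    """Elementwise soft-thresholding: proximal operator of k * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def consensus_lasso(A_list, b_list, lam=0.1, rho=1.0, iters=200):
    """Consensus-ADMM Lasso over N agents, agent j holding (A_j, b_j).

    Solves  min_x  sum_j 0.5 * ||A_j x - b_j||^2 + lam * ||x||_1
    without pooling the raw data: agents exchange only their local
    estimates, never rows of (A_j, b_j).
    """
    N = len(A_list)
    p = A_list[0].shape[1]
    x = [np.zeros(p) for _ in range(N)]   # local estimates
    u = [np.zeros(p) for _ in range(N)]   # scaled dual variables
    z = np.zeros(p)                       # consensus (global) variable
    # Pre-factor each agent's local ridge system (A_j^T A_j + rho*I).
    facs = [np.linalg.inv(A.T @ A + rho * np.eye(p)) for A in A_list]
    for _ in range(iters):
        # Local x-updates: each agent solves a small ridge problem.
        for j in range(N):
            x[j] = facs[j] @ (A_list[j].T @ b_list[j] + rho * (z - u[j]))
        # Consensus z-update: average, then soft-threshold.
        z = soft_threshold(
            np.mean([x[j] + u[j] for j in range(N)], axis=0),
            lam / (rho * N))
        # Dual updates penalize disagreement with the consensus.
        for j in range(N):
            u[j] += x[j] - z
    return z
```

Because all agents share the same consensus variable, running the scheme on the partitioned data converges to the same fit as running it on the pooled data, which mirrors the global-optimality claim in the abstract.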

Original language: English (US)
Title of host publication: 2010 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2010 - Proceedings
Pages: 2978-2981
Number of pages: 4
DOIs
State: Published - Nov 8 2010
Event: 2010 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2010 - Dallas, TX, United States
Duration: Mar 14 2010 - Mar 19 2010

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
ISSN (Print): 1520-6149


Keywords

  • Distributed estimation
  • Lasso
  • Sparse regression

