Generalized Dantzig Selector: Application to the k-support norm

Soumyadeep Chatterjee, Sheng Chen, Arindam Banerjee

Research output: Contribution to journal › Conference article › peer-review

24 Scopus citations


We propose a Generalized Dantzig Selector (GDS) for linear models, in which any norm encoding the parameter structure can be leveraged for estimation. We investigate both computational and statistical aspects of the GDS. Based on the conjugate proximal operator, a flexible inexact ADMM framework is designed for solving the GDS. Thereafter, non-asymptotic high-probability bounds are established on the estimation error, which rely on the Gaussian widths of the unit norm ball and the error set. Further, we consider a non-trivial example of the GDS using the k-support norm. We derive an efficient method to compute the proximal operator for the k-support norm, since existing methods are inapplicable in this setting. For the statistical analysis, we provide upper bounds on the Gaussian widths needed in the GDS analysis, yielding the first statistical recovery guarantee for estimation with the k-support norm. The experimental results confirm our theoretical analysis.
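For readers unfamiliar with the norm the abstract centers on: the k-support norm (Argyriou, Foygel, and Srebro, 2012) interpolates between the ℓ1 norm (k = 1) and the ℓ2 norm (k = d) and admits a closed-form evaluation from the sorted magnitudes of the vector. The sketch below is a minimal illustration of that closed form only; it is not the paper's GDS estimator or its inexact ADMM solver, and the function name `k_support_norm` is a hypothetical helper, not from the paper.

```python
import numpy as np

def k_support_norm(w, k):
    """k-support norm of w via the closed form of Argyriou et al. (2012).

    With z the magnitudes of w sorted in decreasing order (1-indexed, z_0 = +inf),
    find the unique r in {0, ..., k-1} such that
        z_{k-r-1} > (1/(r+1)) * sum_{i >= k-r} z_i >= z_{k-r};
    the norm is then sqrt( sum_{i < k-r} z_i^2 + (sum_{i >= k-r} z_i)^2 / (r+1) ).
    """
    z = np.sort(np.abs(np.asarray(w, dtype=float)))[::-1]  # magnitudes, descending
    d = z.size
    assert 1 <= k <= d
    # tails[i] = z[i] + z[i+1] + ... + z[d-1]
    tails = np.concatenate([np.cumsum(z[::-1])[::-1], [0.0]])
    for r in range(k):
        upper = z[k - r - 2] if k - r - 2 >= 0 else np.inf  # z_{k-r-1}, with z_0 = +inf
        T = tails[k - r - 1]                                # sum of the d-k+r+1 smallest entries
        avg = T / (r + 1)
        if upper > avg >= z[k - r - 1]:
            head = z[: k - r - 1]                           # the k-r-1 largest entries
            return float(np.sqrt(np.dot(head, head) + avg * T))
    # Unreachable in exact arithmetic (a valid r always exists); guard for float ties.
    return float(tails[0] / np.sqrt(k))
```

As a sanity check on the interpolation property, `k_support_norm(w, 1)` equals the ℓ1 norm of `w` and `k_support_norm(w, len(w))` equals its ℓ2 norm; intermediate k give values in between.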

Original language: English (US)
Pages (from-to): 1934-1942
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
Issue number: January
State: Published - 2014
Event: 28th Annual Conference on Neural Information Processing Systems 2014, NIPS 2014 - Montreal, Canada
Duration: Dec 8 2014 - Dec 13 2014

