Convergence of the Huber regression M-estimate in the presence of dense outliers

Efthymios Tsakonas, Joakim Jaldén, Nicholas D. Sidiropoulos, Björn Ottersten

Research output: Contribution to journal › Article › peer-review

17 Scopus citations


We consider the problem of estimating a deterministic unknown vector that depends linearly on n noisy measurements, additionally contaminated with (possibly unbounded) additive outliers. The measurement matrix of the model (i.e., the matrix involved in the linear transformation of the sought vector) is assumed known and composed of i.i.d. standard Gaussian entries. The outlier variables are assumed independent of the measurement matrix, and may be deterministic or random with possibly unknown distribution. Under these assumptions we provide a simple proof that the minimizer of the Huber penalty function of the residuals converges to the true parameter vector at a √n-rate, even when outliers are dense, in the sense that a constant linear fraction of the measurements is contaminated and this fraction can be arbitrarily close to one. The constants influencing the rate of convergence are shown to depend explicitly on the outlier contamination level.
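The setting described above can be sketched numerically: minimize the Huber penalty of the residuals of an overdetermined Gaussian linear model in which a large fraction of the measurements carry gross outliers, and compare against ordinary least squares. The dimensions, noise level, contamination fraction, and Huber threshold below are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not from the paper):
# n measurements, p unknowns, a 40% outlier fraction, Huber threshold delta.
n, p, eps, delta = 500, 10, 0.4, 1.0

A = rng.standard_normal((n, p))            # i.i.d. standard Gaussian measurement matrix
x_true = rng.standard_normal(p)            # deterministic unknown vector (fixed here)
noise = 0.1 * rng.standard_normal(n)       # dense small noise
outliers = np.zeros(n)
idx = rng.choice(n, size=int(eps * n), replace=False)
outliers[idx] = 50.0 * rng.standard_normal(idx.size)   # large outliers on a linear fraction
y = A @ x_true + noise + outliers

def huber_objective(x):
    """Sum of Huber penalties of the residuals, with gradient."""
    r = A @ x - y
    a = np.abs(r)
    # Huber penalty: quadratic for |r| <= delta, linear beyond.
    loss = np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))
    dloss = np.where(a <= delta, r, delta * np.sign(r))
    return loss.sum(), A.T @ dloss

res = minimize(huber_objective, np.zeros(p), jac=True, method="L-BFGS-B")
x_huber = res.x
x_ls = np.linalg.lstsq(A, y, rcond=None)[0]  # plain least squares for comparison

print("Huber error:", np.linalg.norm(x_huber - x_true))
print("LS error:   ", np.linalg.norm(x_ls - x_true))
```

Even with 40% of the measurements grossly corrupted, the Huber minimizer stays close to the true vector, while least squares is pulled far off by the outliers.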

Original language: English (US)
Article number: 6828704
Pages (from-to): 1211-1214
Number of pages: 4
Journal: IEEE Signal Processing Letters
Issue number: 10
State: Published - Oct 2014


  • Breakdown point (BP)
  • Huber estimator
  • dense outliers
  • performance analysis

