Fast convergent algorithms for multi-kernel regression

Liang Zhang, Daniel Romero, Georgios B. Giannakis

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

6 Scopus citations


Kernel ridge regression plays a central role in various signal processing and machine learning applications. Suitable kernels are often chosen as linear combinations of 'basis kernels' by optimizing criteria under regularization constraints. Although such approaches offer reliable generalization performance, solving the associated min-max optimization problems faces major challenges, especially with big data inputs. After analyzing the key properties of a convex reformulation, the present paper introduces an efficient algorithm based on a generalization of Nesterov's acceleration method, which achieves an order-optimal convergence rate among first-order methods. Closed-form updates are derived for common regularizers. Experiments on real datasets corroborate considerable speedup advantages over competing algorithms.
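To illustrate the multi-kernel setting the abstract describes, the sketch below fits kernel ridge regression with a simplex-constrained combination of Gaussian basis kernels. It uses a simple alternating heuristic (KRR dual solve, then a multiplicative weight update) purely for illustration; the paper's actual contribution is an accelerated first-order method for the convex min-max reformulation, which is not reproduced here. All function names, bandwidth choices, and the weight-update rule are assumptions for this toy example.

```python
import numpy as np

def gaussian_kernel(X, Z, bandwidth):
    """Gaussian (RBF) basis kernel matrix between rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def multi_kernel_ridge(X, y, bandwidths, lam=1e-2, n_iter=50):
    """Toy multi-kernel ridge regression: learn simplex weights over
    Gaussian basis kernels by alternating a KRR solve with a
    multiplicative weight update (illustrative heuristic only; not
    the paper's accelerated min-max algorithm)."""
    n = X.shape[0]
    Ks = [gaussian_kernel(X, X, b) for b in bandwidths]
    theta = np.full(len(Ks), 1.0 / len(Ks))  # kernel weights on the simplex
    for _ in range(n_iter):
        K = sum(t * Kb for t, Kb in zip(theta, Ks))
        alpha = np.linalg.solve(K + lam * np.eye(n), y)  # KRR dual solve
        # reweight each basis kernel by the quadratic form alpha' Kb alpha,
        # which is nonnegative since each Kb is positive semidefinite
        scores = np.array([max(alpha @ Kb @ alpha, 1e-12) for Kb in Ks])
        theta = scores / scores.sum()
    return alpha, theta, Ks

# usage: fit a noisy 1-D signal with three candidate bandwidths
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
alpha, theta, Ks = multi_kernel_ridge(X, y, bandwidths=[0.3, 1.0, 3.0])
K = sum(t * Kb for t, Kb in zip(theta, Ks))
rmse = float(np.sqrt(np.mean((K @ alpha - y) ** 2)))
print("kernel weights:", np.round(theta, 3))
print("training RMSE:", round(rmse, 3))
```

The learned weights concentrate on the bandwidths that best explain the data, which is the mechanism the regularized min-max criterion formalizes.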

Original language: English (US)
Title of host publication: 2016 19th IEEE Statistical Signal Processing Workshop, SSP 2016
Publisher: IEEE Computer Society
ISBN (Electronic): 9781467378024
State: Published - Aug 24 2016
Event: 19th IEEE Statistical Signal Processing Workshop, SSP 2016 - Palma de Mallorca, Spain
Duration: Jun 25 2016 - Jun 29 2016

Publication series

Name: IEEE Workshop on Statistical Signal Processing Proceedings


Other: 19th IEEE Statistical Signal Processing Workshop, SSP 2016
City: Palma de Mallorca

Bibliographical note

Publisher Copyright:
© 2016 IEEE.


Keywords

  • Bregman divergence
  • Kernel ridge regression
  • Nesterov's accelerated gradient method
  • multi-kernel learning


