Adaptive Learning in Complex Reproducing Kernel Hilbert Spaces Employing Wirtinger's Subgradients

Pantelis Bouboulis, Konstantinos Slavakis, Sergios Theodoridis

Research output: Contribution to journal › Article › peer-review

41 Scopus citations


This paper presents a general framework for nonlinear online supervised learning tasks in the context of complex-valued signal processing. The (complex) input data are mapped into a complex reproducing kernel Hilbert space (RKHS), where the learning phase takes place. Both pure complex kernels and real kernels (via the complexification trick) can be employed. Moreover, any convex, continuous, and not necessarily differentiable function can be used to measure the loss between the output of the system and the desired response. The only requirement is that the subgradient of the adopted loss function be available in analytic form. To derive the subgradients analytically, the principles of the (recently developed) Wirtinger's calculus in complex RKHS are exploited. Furthermore, both linear and widely linear (in the RKHS) estimation filters are considered. To cope with the increasing memory requirements present in almost all online schemes in RKHS, a sparsification scheme based on projections onto closed balls has been adopted. We demonstrate the effectiveness of the proposed framework in a nonlinear channel identification task, a nonlinear channel equalization problem, and a quadrature phase shift keying equalization scheme, using both circular and noncircular synthetic signal sources.
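To make the abstract's ingredients concrete, the following is a minimal sketch (not the paper's exact algorithm) of an online kernel filter for complex data. It combines a real Gaussian kernel applied to complex inputs (the complexification view, where a complex scalar is treated as a point in R^2), the absolute-error loss whose Wirtinger subgradient with respect to the conjugate error is proportional to e/|e|, and sparsification by projecting the coefficient vector onto a closed l2 ball. The class name, kernel choice, step size, and ball radius are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)


class ComplexKernelSubgradientFilter:
    """Sketch of an online kernel filter for complex-valued data.

    Assumptions (not from the paper's exact formulation): a real
    Gaussian kernel on complex scalars, the nondifferentiable loss
    |e| with Wirtinger subgradient step mu * e/|e|, and projection
    of the coefficient vector onto a closed l2 ball of radius B.
    """

    def __init__(self, sigma=1.0, mu=0.1, radius=5.0):
        self.sigma = sigma    # kernel width (assumed value)
        self.mu = mu          # step size (assumed value)
        self.radius = radius  # l2 ball radius B for sparsification
        self.centers = []     # dictionary of stored past inputs
        self.coeffs = []      # complex expansion coefficients

    def _kernel(self, z, w):
        # Real Gaussian kernel; |z - w| is the complex modulus,
        # i.e., the Euclidean distance after complexification.
        return np.exp(-abs(z - w) ** 2 / self.sigma ** 2)

    def predict(self, z):
        # Kernel expansion f(z) = sum_n a_n * k(c_n, z).
        return sum(a * self._kernel(c, z)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, z, d):
        e = d - self.predict(z)  # complex a priori error
        if abs(e) > 0:
            # Subgradient step for the loss |e|: the new center
            # enters the expansion with coefficient mu * e/|e|.
            self.centers.append(z)
            self.coeffs.append(self.mu * e / abs(e))
        # Projection onto the closed l2 ball of radius B: rescale
        # the coefficient vector whenever its norm exceeds B.
        a = np.array(self.coeffs)
        n = np.linalg.norm(a)
        if n > self.radius:
            self.coeffs = list(a * (self.radius / n))
        return e


# Toy nonlinear channel identification: d = z + 0.1 z^2 + noise,
# driven by a circular complex Gaussian input source.
f = ComplexKernelSubgradientFilter(sigma=2.0, mu=0.2, radius=3.0)
errors = []
for _ in range(300):
    z = rng.normal(size=2) @ np.array([1, 1j])
    d = z + 0.1 * z ** 2 + 0.01 * (rng.normal() + 1j * rng.normal())
    errors.append(abs(f.update(z, d)))
```

The projection step keeps the coefficient norm bounded regardless of how many centers are admitted, which is the mechanism the abstract refers to for controlling the growing memory of online RKHS schemes.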

Original language: English (US)
Article number: 6126047
Pages (from-to): 425-438
Number of pages: 14
Journal: IEEE Transactions on Neural Networks and Learning Systems
Issue number: 3
State: Published - Dec 1 2012


  • Adaptive kernel learning
  • Wirtinger's calculus
  • complex kernels
  • projection
  • subgradient
  • widely linear estimation

