Large-Scale Kernel-Based Feature Extraction via Low-Rank Subspace Tracking on a Budget

Fatemeh Sheikholeslami, Dimitris Berberidis, Georgios B. Giannakis

Research output: Contribution to journal › Article



Kernel-based methods enjoy powerful generalization capabilities in learning a variety of pattern recognition tasks. When such methods are provided with sufficient training data, broadly applicable classes of nonlinear functions can be approximated with desired accuracy. Nevertheless, inherent to the nonparametric nature of kernel-based estimators are computational and memory requirements that become prohibitive with large-scale datasets. In response to this formidable challenge, this paper puts forward a low-rank, kernel-based, feature extraction approach that is particularly tailored for online operation. A novel generative model is introduced to approximate high-dimensional (possibly infinite) features via a low-rank nonlinear subspace, the learning of which lends itself to a kernel function approximation. Offline and online solvers are developed for the subspace learning task, along with affordable versions, in which the number of stored data vectors is confined to a predefined budget. Analytical results provide performance bounds on how well the kernel matrix, as well as kernel-based classification and regression tasks, can be approximated by leveraging budgeted online subspace learning and feature extraction schemes. Tests on synthetic and real datasets demonstrate and benchmark the efficiency of the proposed method for dynamic nonlinear subspace tracking as well as online classification and regression tasks.
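To make the idea concrete, the following is a minimal sketch, not the paper's exact algorithm: it illustrates budgeted low-rank kernel feature extraction with a Nyström-style approximation, where a fixed budget of stored data vectors serves as the dictionary and features are obtained by projecting onto the top-rank kernel subspace. All function names and parameters here are illustrative choices, not from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    """Gaussian (RBF) kernel matrix between rows of X and Y."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def budgeted_features(X, budget=40, rank=10, gamma=0.1, seed=0):
    """Low-rank kernel features from a budget of stored vectors.

    A Nystrom-style stand-in for a budgeted kernel subspace learner:
    keep `budget` landmark points, eigendecompose their kernel matrix,
    and map every sample onto the top-`rank` kernel subspace so that
    the inner products of the features approximate the kernel matrix.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=budget, replace=False)  # budgeted dictionary
    Z = X[idx]
    Kzz = rbf_kernel(Z, Z, gamma)
    vals, vecs = np.linalg.eigh(Kzz)                  # ascending eigenvalues
    vals, vecs = vals[::-1][:rank], vecs[:, ::-1][:, :rank]
    # Feature map: phi(x) = diag(vals)^(-1/2) V^T k(Z, x)
    W = vecs / np.sqrt(np.maximum(vals, 1e-12))
    return rbf_kernel(X, Z, gamma) @ W                # n x rank features

X = np.random.default_rng(1).normal(size=(200, 5))
F = budgeted_features(X, budget=40, rank=10)
# F @ F.T is a rank-10 approximation of the full 200 x 200 kernel matrix,
# computed while storing only the 40 budgeted landmark vectors.
```

The memory footprint is governed by the budget (here 40 stored vectors) rather than the full dataset size, which is the property the paper's budgeted online schemes are designed to preserve; the paper's solvers additionally update the subspace online as samples stream in, which this static sketch omits.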

Original language: English (US)
Pages (from-to): 1967-1981
Number of pages: 15
Journal: IEEE Transactions on Signal Processing
Issue number: 8
State: Published - Apr 15 2018



Keywords

  • Online nonlinear feature extraction
  • budgeted learning
  • classification
  • kernel methods
  • nonlinear subspace tracking
  • regression
