Online unsupervised learning using ensemble Gaussian processes with random features

Research output: Contribution to journal › Conference article › peer-review



Gaussian process latent variable models (GPLVMs) are powerful, yet computationally heavy, tools for nonlinear dimensionality reduction. Existing scalable variants rely on low-rank kernel matrix approximants that in essence subsample the embedding space. This work develops an efficient online approach based on random features, replacing spatial with spectral subsampling. The novel approach bypasses the need to optimize over spatial samples without sacrificing performance. Unlike GPLVM, whose performance depends on the choice of kernel, the proposed algorithm relies on an ensemble of kernels, which allows adaptation to a wide range of operating environments. It further allows initial exploration of a richer function space than methods adhering to a single fixed kernel, followed by sequential contraction of the search space as more data become available. Tests on benchmark datasets demonstrate the effectiveness of the proposed method.
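The spectral subsampling mentioned above builds on the random Fourier feature idea of Rahimi and Recht: frequencies are drawn from the kernel's spectral density so that inner products of the feature maps approximate the kernel. The sketch below is an illustration of that general technique for an RBF kernel, not the authors' ensemble algorithm; the function name, lengthscale, and feature count are illustrative choices.

```python
import numpy as np

def random_fourier_features(X, n_features=500, lengthscale=1.0, seed=0):
    """Map inputs to a random feature space whose inner products
    approximate the RBF (Gaussian) kernel (Rahimi & Recht, 2007)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Spectral subsampling: frequencies drawn from the kernel's
    # spectral density (Gaussian for the RBF kernel).
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Z @ Z.T approximates the exact RBF kernel matrix
# exp(-||x - x'||^2 / (2 * lengthscale^2)).
X = np.random.default_rng(1).normal(size=(5, 3))
Z = random_fourier_features(X, n_features=2000)
K_approx = Z @ Z.T
K_exact = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1) / 2.0)
```

Because the feature map is fixed once the frequencies are drawn, new data points can be embedded in O(n_features) time, which is what makes a streaming (online) update feasible; an ensemble over several candidate kernels would maintain one such feature map per kernel and weight them by their online fit.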

Original language: English (US)
Pages (from-to): 3190-3194
Number of pages: 5
Journal: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
State: Published - Jun 6 2021
Event: 2021 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2021 - Virtual, Toronto, Canada
Duration: Jun 6 2021 - Jun 11 2021

Bibliographical note

Funding Information:
This work was supported in part by NSF grant 1901134.

Publisher Copyright:
© 2021 IEEE


  • Dimensionality reduction
  • Ensemble learning
  • Gaussian processes
  • Random features

