Abstract
Gaussian process latent variable models (GPLVMs) are powerful yet computationally demanding tools for nonlinear dimensionality reduction. Existing scalable variants rely on low-rank kernel matrix approximations that, in essence, subsample the embedding space. This work develops an efficient online approach based on random features, replacing spatial subsampling with spectral subsampling. The novel approach bypasses the need to optimize over spatial samples without sacrificing performance. Unlike the standard GPLVM, whose performance depends on the choice of kernel, the proposed algorithm relies on an ensemble of kernels, which allows it to adapt to a wide range of operating environments. It further enables initial exploration of a richer function space than methods adhering to a single fixed kernel, followed by sequential contraction of the search space as more data become available. Tests on benchmark datasets demonstrate the effectiveness of the proposed method.
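The abstract does not spell out the random-feature construction, so the sketch below illustrates the general "spectral subsampling" idea it refers to: standard random Fourier features for an RBF kernel, where frequencies are drawn from the kernel's spectral density (Bochner's theorem) and inner products of the resulting features approximate kernel evaluations. All function and parameter names here (`rff_features`, `num_features`, `lengthscale`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def rff_features(X, num_features=500, lengthscale=1.0, seed=0):
    """Map inputs X (n x d) to random Fourier features (n x num_features).

    Inner products of these features approximate the RBF kernel
    k(x, y) = exp(-||x - y||^2 / (2 * lengthscale^2)).
    NOTE: a generic sketch of spectral subsampling, not the paper's algorithm.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies sampled from the RBF kernel's spectral density (a Gaussian).
    W = rng.normal(scale=1.0 / lengthscale, size=(d, num_features))
    # Random phases make a single cosine per frequency an unbiased estimator.
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

# Illustrative check: the feature-based Gram matrix approximates the exact one.
X = np.random.default_rng(1).normal(size=(50, 3))
Phi = rff_features(X, num_features=2000)
K_approx = Phi @ Phi.T
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K_exact = np.exp(-0.5 * sq_dists)
err = np.max(np.abs(K_approx - K_exact))
```

Because the features have fixed dimension, the Gram matrix never needs to be formed or inverted at full size, which is what makes an online, streaming GPLVM update feasible.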
Original language | English (US) |
---|---|
Pages (from-to) | 3190-3194 |
Number of pages | 5 |
Journal | ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings |
Volume | 2021-June |
State | Published - Jun 6 2021 |
Event | 2021 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2021 - Virtual, Toronto, Canada (Duration: Jun 6 2021 → Jun 11 2021) |
Bibliographical note
Funding Information: This work was supported in part by NSF grant 1901134.
Publisher Copyright:
© 2021 IEEE
Keywords
- Dimensionality reduction
- Ensemble learning
- Gaussian processes
- Random features