TY - JOUR
T1 - Large-Scale Kernel-Based Feature Extraction via Low-Rank Subspace Tracking on a Budget
AU - Sheikholeslami, Fatemeh
AU - Berberidis, Dimitris
AU - Giannakis, Georgios B.
N1 - Publisher Copyright:
© 1991-2012 IEEE.
PY - 2018/4/15
Y1 - 2018/4/15
N2 - Kernel-based methods enjoy powerful generalization capabilities in learning a variety of pattern recognition tasks. When such methods are provided with sufficient training data, broadly applicable classes of nonlinear functions can be approximated with the desired accuracy. Nevertheless, inherent to the nonparametric nature of kernel-based estimators are computational and memory requirements that become prohibitive with large-scale datasets. In response to this formidable challenge, this paper puts forward a low-rank, kernel-based feature extraction approach that is particularly tailored for online operation. A novel generative model is introduced to approximate high-dimensional (possibly infinite) features via a low-rank nonlinear subspace, the learning of which lends itself to a kernel function approximation. Offline and online solvers are developed for the subspace learning task, along with affordable versions in which the number of stored data vectors is confined to a predefined budget. Analytical results provide performance bounds on how well the kernel matrix, as well as kernel-based classification and regression tasks, can be approximated by leveraging budgeted online subspace learning and feature extraction schemes. Tests on synthetic and real datasets demonstrate and benchmark the efficiency of the proposed method for dynamic nonlinear subspace tracking as well as online classification and regression tasks.
KW - Online nonlinear feature extraction
KW - budgeted learning
KW - classification
KW - kernel methods
KW - nonlinear subspace tracking
KW - regression
UR - http://www.scopus.com/inward/record.url?scp=85041495106&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85041495106&partnerID=8YFLogxK
U2 - 10.1109/TSP.2018.2802446
DO - 10.1109/TSP.2018.2802446
M3 - Article
AN - SCOPUS:85041495106
SN - 1053-587X
VL - 66
SP - 1967
EP - 1981
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 8
ER -