Principal component analysis (PCA) has well-documented merits for feature extraction and dimensionality reduction. However, PCA deals with a single dataset at a time, and it is challenged when it comes to analyzing multiple datasets jointly. Yet in certain setups, one wishes to extract the most significant information of one dataset relative to other datasets. Specifically, the interest may be in identifying or extracting features that are specific to a single target dataset but not to the others. This paper presents a novel approach for such so-termed discriminative data analysis, and establishes its optimality in the least-squares sense under suitable assumptions. The criterion reveals linear combinations of variables by maximizing the ratio of the variance of the target data to that of the remaining datasets. The novel approach solves a generalized eigenvalue problem by performing a single SVD. Numerical tests using synthetic and real datasets showcase the merits of the proposed approach relative to competing alternatives.
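The criterion above can be sketched in code. The following is a minimal NumPy illustration, not the authors' implementation: given a target dataset and a background (remaining) dataset, it forms the two sample covariances, whitens by the background covariance, and obtains the generalized eigenvectors from a single eigendecomposition of the whitened target covariance. The function name, argument layout, and the small regularizer `reg` are assumptions made for this sketch.

```python
import numpy as np

def discriminative_pca(X, Y, k, reg=1e-8):
    """Sketch of discriminative PCA.

    X : (n_target, d) target data matrix (rows are samples).
    Y : (n_bg, d) background ("remaining") data matrix.
    k : number of discriminative directions to return.

    Maximizes the ratio of target variance to background variance,
    i.e. solves the generalized eigenproblem Cx u = lam * Cy u, by
    whitening with Cy^{-1/2} and decomposing the whitened target
    covariance once.
    """
    # Center each dataset and form sample covariances.
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Cx = X.T @ X / len(X)
    Cy = Y.T @ Y / len(Y) + reg * np.eye(X.shape[1])  # reg keeps Cy invertible

    # Whitening transform W = Cy^{-1/2} via a symmetric eigendecomposition.
    evals, evecs = np.linalg.eigh(Cy)
    W = evecs @ np.diag(evals ** -0.5) @ evecs.T

    # One decomposition of the whitened target covariance; eigh returns
    # eigenvalues in ascending order, so reverse to take the top-k.
    M = W @ Cx @ W
    _, V = np.linalg.eigh(M)
    U = W @ V[:, ::-1][:, :k]  # map back: generalized eigenvectors u = W v
    return U
```

On synthetic data where the target varies strongly along one axis and the background along another, the leading discriminative direction aligns with the target-specific axis rather than the globally largest-variance one.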
Original language: English (US)
Title of host publication: 2018 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2018 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 5
State: Published - Sep 10 2018
Event: 2018 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2018 - Calgary, Canada (Apr 15 2018 → Apr 20 2018)
Series: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Bibliographical note (Funding Information): Work in this paper was supported in part by NIH 1R01GM104975-01 and NSF 1500713.
© 2018 IEEE.
Keywords:
- Dimensionality reduction
- Discriminative analytics
- Robust principal component analysis