Find the dimension that counts: Fast dimension estimation and Krylov PCA

Shashanka Ubaru, Abd Krim Seghouane, Yousef Saad

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

High dimensional data and systems with many degrees of freedom are often characterized by covariance matrices. In this paper, we consider the problem of simultaneously estimating the dimension of the principal (dominant) subspace of these covariance matrices and obtaining an approximation to the subspace. This problem arises in the popular principal component analysis (PCA), and in many applications of machine learning, data analysis, signal and image processing, and others. We first present a novel method for estimating the dimension of the principal subspace. We then show how this method can be coupled with a Krylov subspace method to simultaneously estimate the dimension and obtain an approximation to the subspace. The dimension estimation is achieved at no additional cost. The proposed method operates on a model selection framework, where the novel selection criterion is derived based on random matrix perturbation theory ideas. We present theoretical analyses which (a) show that the proposed method achieves strong consistency (i.e., yields optimal solution as the number of data-points n → ∞), and (b) analyze conditions for exact dimension estimation in the finite n case. Using recent results, we show that our algorithm also yields near optimal PCA. The proposed method avoids forming the sample covariance matrix (associated with the data) explicitly and computing the complete eigen-decomposition. Therefore, the method is inexpensive, which is particularly advantageous in modern data applications where the covariance matrices can be very large. Numerical experiments illustrate the performance of the proposed method in various applications.
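The pipeline the abstract describes — approximating the dominant subspace of a covariance matrix with a Krylov method, without ever forming the covariance matrix, and reading an estimated dimension off the same computation — can be sketched with standard tools. The sketch below uses SciPy's Lanczos-based `eigsh` on a matrix-free covariance operator. The synthetic data and the largest-eigengap selection rule are illustrative stand-ins only; the paper's actual criterion is derived from random matrix perturbation theory and is not reproduced here.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh

rng = np.random.default_rng(0)
n, p, true_dim = 2000, 200, 5

# Synthetic zero-mean data: true_dim strong directions plus isotropic noise.
basis, _ = np.linalg.qr(rng.standard_normal((p, true_dim)))
X = (rng.standard_normal((n, true_dim)) * 10.0) @ basis.T \
    + rng.standard_normal((n, p))

# Matrix-free sample covariance: v -> X^T (X v) / n.
# The p x p covariance matrix is never formed explicitly.
cov = LinearOperator((p, p), matvec=lambda v: X.T @ (X @ v) / n,
                     dtype=np.float64)

k = 20  # Krylov budget, assumed larger than the true dimension
eigvals, eigvecs = eigsh(cov, k=k, which="LM")  # Lanczos iteration
eigvals = eigvals[::-1]                         # sort descending

# Stand-in selection rule (NOT the paper's criterion): largest eigengap.
gaps = eigvals[:-1] - eigvals[1:]
d_hat = int(np.argmax(gaps)) + 1

print(d_hat)  # estimated dimension of the principal subspace
```

Because `eigsh` only needs matrix-vector products with the data matrix, the cost per iteration is O(np) rather than the O(np²) needed to form the covariance, which mirrors the efficiency argument made in the abstract.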

Original language: English (US)
Title of host publication: SIAM International Conference on Data Mining, SDM 2019
Publisher: Society for Industrial and Applied Mathematics Publications
Pages: 720-728
Number of pages: 9
ISBN (Electronic): 9781611975673
State: Published - Jan 1 2019
Event: 19th SIAM International Conference on Data Mining, SDM 2019 - Calgary, Canada
Duration: May 2 2019 - May 4 2019

Publication series

Name: SIAM International Conference on Data Mining, SDM 2019

Conference

Conference: 19th SIAM International Conference on Data Mining, SDM 2019
Country: Canada
City: Calgary
Period: 5/2/19 - 5/4/19

Cite this

Ubaru, S., Seghouane, A. K., & Saad, Y. (2019). Find the dimension that counts: Fast dimension estimation and Krylov PCA. In SIAM International Conference on Data Mining, SDM 2019 (pp. 720-728). (SIAM International Conference on Data Mining, SDM 2019). Society for Industrial and Applied Mathematics Publications.

