### Abstract

Many machine learning and data-related applications require knowledge of the approximate ranks of large data matrices. This letter presents two computationally inexpensive techniques for estimating the approximate rank of such matrices. These techniques exploit approximate spectral densities, popular in physics, which are probability density distributions that measure the likelihood of finding an eigenvalue of the matrix at a given point on the real line. Integrating the spectral density over an interval gives the count of the matrix's eigenvalues in that interval, so the rank can be approximated by integrating the spectral density over a carefully selected interval. Two approaches to estimating the approximate rank are discussed: one based on Chebyshev polynomials and the other on the Lanczos algorithm. Obtaining the appropriate interval requires locating a gap between the eigenvalues that correspond to noise and the relevant eigenvalues that contribute to the matrix rank. A method for locating this gap and selecting the interval of integration from a plot of the spectral density is proposed. Numerical experiments illustrate the performance of these techniques on matrices from typical applications.
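To make the idea concrete, here is a rough sketch of the Chebyshev-polynomial flavor of this approach (not the authors' exact implementation). It counts the eigenvalues of a symmetric matrix in an interval [a, b] by expanding the interval's indicator function in Chebyshev polynomials (with Jackson damping to suppress Gibbs oscillations) and estimating the trace of the resulting matrix function with Hutchinson's stochastic estimator. The function name, the Rademacher probes, and the demo matrix are illustrative assumptions; the spectrum is assumed to lie in [-1, 1], and the threshold a plays the role of the gap located from the spectral density plot.

```python
import numpy as np

def chebyshev_eig_count(A, a, b, deg=100, nv=40, seed=0):
    """Estimate how many eigenvalues of the symmetric matrix A lie in [a, b].

    Assumes the spectrum of A is contained in [-1, 1]. Expands the
    indicator function of [a, b] in Chebyshev polynomials, applies
    Jackson damping, and estimates trace(p(A)) with nv Rademacher probes.
    (Illustrative sketch; names and defaults are assumptions.)
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Chebyshev coefficients of the step function 1_[a,b] on [-1, 1]:
    # c_0 = (arccos(a) - arccos(b)) / pi,
    # c_k = (2 / (pi k)) (sin(k arccos(a)) - sin(k arccos(b))), k >= 1.
    ta, tb = np.arccos(a), np.arccos(b)
    j = np.arange(deg + 1)
    gamma = np.empty(deg + 1)
    gamma[0] = (ta - tb) / np.pi
    k = j[1:]
    gamma[1:] = 2.0 / (np.pi * k) * (np.sin(k * ta) - np.sin(k * tb))
    # Jackson damping coefficients reduce Gibbs oscillations near a and b.
    M = deg + 1
    alpha = np.pi / (M + 1)
    jackson = ((M - j + 1) * np.sin(alpha) * np.cos(j * alpha)
               + np.cos(alpha) * np.sin(j * alpha)) / ((M + 1) * np.sin(alpha))
    gamma *= jackson
    # Stochastic trace of p(A) via the three-term Chebyshev recurrence:
    # T_0(A)v = v, T_1(A)v = Av, T_j(A)v = 2A T_{j-1}(A)v - T_{j-2}(A)v.
    total = 0.0
    for _ in range(nv):
        v = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe vector
        w_prev, w = v, A @ v
        acc = gamma[0] * (v @ v) + gamma[1] * (v @ w)
        for _j in range(2, deg + 1):
            w_prev, w = w, 2.0 * (A @ w) - w_prev
            acc += gamma[_j] * (v @ w)
        total += acc
    return total / nv

# Demo: a 200 x 200 symmetric matrix with 20 "signal" eigenvalues in
# [0.6, 0.9] and 180 "noise" eigenvalues in [0, 0.1]; counting above the
# gap (here, the interval [0.4, 1.0]) should recover roughly 20.
rng = np.random.default_rng(0)
eigs = np.concatenate([rng.uniform(0.6, 0.9, 20), rng.uniform(0.0, 0.1, 180)])
Q, _ = np.linalg.qr(rng.standard_normal((200, 200)))
A = Q @ np.diag(eigs) @ Q.T
A = (A + A.T) / 2  # enforce exact symmetry
est = chebyshev_eig_count(A, 0.4, 1.0)
print(est)
```

In practice the matrix would first be shifted and scaled so its spectrum lies in [-1, 1], and only matrix-vector products with A are needed, which is what makes the method inexpensive for large sparse matrices.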

Original language | English (US)
---|---
Pages (from-to) | 1317–1351
Number of pages | 35
Journal | *Neural Computation*
Volume | 29
Issue number | 5
DOIs | https://doi.org/10.1162/NECO_a_00951
State | Published - May 1 2017

## Cite this

Fast estimation of approximate matrix ranks using spectral densities. *Neural Computation*, *29*(5), 1317–1351. https://doi.org/10.1162/NECO_a_00951