Generalized thresholding of large covariance matrices

Adam J. Rothman, Elizaveta Levina, Ji Zhu

Research output: Contribution to journal › Article

194 Scopus citations

Abstract

We propose a new class of generalized thresholding operators that combine thresholding with shrinkage, and study generalized thresholding of the sample covariance matrix in high dimensions. Generalized thresholding of the covariance matrix has good theoretical properties and carries almost no computational burden. We obtain an explicit convergence rate in the operator norm that shows the tradeoff between the sparsity of the true model, dimension, and the sample size, and shows that generalized thresholding is consistent over a large class of models as long as the dimension p and the sample size n satisfy log p/n → 0. In addition, we show that generalized thresholding has the "sparsistency" property, meaning it estimates true zeros as zeros with probability tending to 1, and, under an additional mild condition, is sign consistent for nonzero elements. We show that generalized thresholding covers, as special cases, hard and soft thresholding, smoothly clipped absolute deviation, and adaptive lasso, and compare different types of generalized thresholding in a simulation study and in an example of gene clustering from a microarray experiment with tumor tissues.
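As a rough illustration of the idea described in the abstract (not the authors' code), the sketch below applies elementwise soft thresholding, one member of the generalized thresholding class, to a sample covariance matrix in Python/NumPy. Keeping the diagonal unthresholded and scaling the threshold as a multiple of sqrt(log p / n) are assumptions made here for the example, motivated by the stated convergence condition.

```python
# Minimal sketch (not the authors' implementation): elementwise soft
# thresholding of the sample covariance matrix, one member of the
# generalized thresholding class of operators.
import numpy as np

def soft_threshold_cov(X, lam):
    """Soft-threshold the off-diagonal entries of the sample covariance of X.

    X   : (n, p) data matrix
    lam : threshold level, e.g. a multiple of sqrt(log(p) / n)
    """
    S = np.cov(X, rowvar=False)                      # p x p sample covariance
    T = np.sign(S) * np.maximum(np.abs(S) - lam, 0)  # s_lam(x) = sign(x)(|x| - lam)_+
    np.fill_diagonal(T, np.diag(S))                  # leave the diagonal unthresholded (assumption)
    return T

# Example usage; the constant 0.5 is arbitrary and would normally be
# chosen by cross-validation.
rng = np.random.default_rng(0)
n, p = 100, 50
X = rng.standard_normal((n, p))
lam = 0.5 * np.sqrt(np.log(p) / n)
Sigma_hat = soft_threshold_cov(X, lam)
```

Other members of the class (hard thresholding, SCAD, adaptive lasso) would replace only the elementwise rule applied to S; the rest of the procedure is unchanged, which is why the approach carries almost no computational burden.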

Original language: English (US)
Pages (from-to): 177-186
Number of pages: 10
Journal: Journal of the American Statistical Association
Volume: 104
Issue number: 485
DOIs
State: Published - Mar 1 2009

Keywords

  • Covariance
  • High-dimensional data
  • Regularization
  • Sparsity
  • Thresholding

