Abstract
Using convex optimization, we construct a sparse estimator of the covariance matrix that is positive definite and performs well in high-dimensional settings. A lasso-type penalty is used to encourage sparsity and a logarithmic barrier function is used to enforce positive definiteness. Consistency and convergence rate bounds are established as both the number of variables and sample size diverge. An efficient computational algorithm is developed and the merits of the approach are illustrated with simulations and a speech signal classification example.
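To make the idea in the abstract concrete, the sketch below minimizes a penalized Frobenius loss with an off-diagonal lasso-type penalty and a log-determinant barrier by proximal gradient descent. This is only an illustration of the general technique, not the paper's actual estimator, notation, or algorithm; the objective form, the function names, the tuning values `lam`, `tau`, and `step`, and the solver choice are all assumptions made for this sketch.

```python
import numpy as np

def soft_threshold_offdiag(M, t):
    """Soft-threshold the off-diagonal entries of M by t; leave the diagonal untouched."""
    out = np.sign(M) * np.maximum(np.abs(M) - t, 0.0)
    np.fill_diagonal(out, np.diag(M))
    return out

def sparse_pd_cov(S, lam=0.1, tau=1e-4, step=0.1, n_iter=500):
    """Illustrative proximal-gradient solver for an objective of the form
        0.5 * ||Sigma - S||_F^2 + lam * sum_{i != j} |Sigma_ij| - tau * log det(Sigma),
    combining a lasso-type penalty (sparsity) with a log-determinant
    barrier (positive definiteness). All tuning values are placeholders."""
    p = S.shape[0]
    Sigma = np.diag(np.diag(S)) + 1e-3 * np.eye(p)  # positive definite starting value
    for _ in range(n_iter):
        grad = (Sigma - S) - tau * np.linalg.inv(Sigma)  # gradient of the smooth part
        t = step
        while True:
            candidate = soft_threshold_offdiag(Sigma - t * grad, t * lam)
            if np.all(np.linalg.eigvalsh(candidate) > 0):  # accept only PD iterates
                break
            t *= 0.5  # shrink the step until positive definiteness holds
        Sigma = candidate
    return Sigma

# Example: shrink a noisy sample covariance toward a sparse, positive definite estimate.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 30))           # n = 50 observations, p = 30 variables
S = np.cov(X, rowvar=False)
Sigma_hat = sparse_pd_cov(S, lam=0.2)
print(np.linalg.eigvalsh(Sigma_hat).min())  # smallest eigenvalue stays positive
```

In this sketch the backtracking on the step size keeps every iterate strictly inside the positive definite cone, mirroring the role the logarithmic barrier plays in the estimator described in the abstract, while the soft-thresholding step produces the sparsity encouraged by the lasso-type penalty.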
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 733–740 |
| Number of pages | 8 |
| Journal | Biometrika |
| Volume | 99 |
| Issue number | 3 |
| DOIs | |
| State | Published - Sep 2012 |
Bibliographical note
Funding Information: The author thanks the editor, associate editor, and referees for helpful suggestions, as well as Tiefeng Jiang for a helpful discussion. The author’s research is supported in part by a grant from the U.S. National Science Foundation.
Keywords
- Barrier function
- Classification
- Convex optimization
- High-dimensional data
- Sparsity