This article proposes a penalized likelihood method to jointly estimate multiple precision matrices for use in quadratic discriminant analysis (QDA) and model-based clustering. We use a ridge penalty together with a ridge fusion penalty to introduce shrinkage and to promote similarity among the precision matrix estimates. We optimize with a blockwise coordinate descent algorithm and select tuning parameters with a validation likelihood. We demonstrate the method in QDA and in semi-supervised model-based clustering.
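To make the objective concrete, the following is a minimal sketch of a ridge-fused penalized Gaussian log-likelihood of the kind described above. The function name, the exact penalty scaling, and the use of squared Frobenius norms for both penalties are illustrative assumptions, not the article's definitive formulation.

```python
import numpy as np

def penalized_loglik(omegas, samples, lam_ridge, lam_fusion):
    """Sketch of a ridge-fused penalized log-likelihood (assumed form).

    omegas  : list of K precision matrices, each p x p
    samples : list of K data matrices, each n_k x p (one per class)
    lam_ridge, lam_fusion : nonnegative tuning parameters
    """
    total = 0.0
    for Omega, X in zip(omegas, samples):
        n_k = X.shape[0]
        # per-class sample covariance (MLE scaling)
        S_k = np.cov(X, rowvar=False, bias=True)
        _, logdet = np.linalg.slogdet(Omega)
        # Gaussian log-likelihood contribution of class k (up to constants)
        total += n_k * (logdet - np.trace(S_k @ Omega))
    # ridge penalty: shrinks each precision matrix estimate
    total -= lam_ridge * sum(np.sum(O ** 2) for O in omegas)
    # ridge fusion penalty: pulls pairs of precision matrices toward each other
    K = len(omegas)
    for k in range(K):
        for l in range(k + 1, K):
            total -= lam_fusion * np.sum((omegas[k] - omegas[l]) ** 2)
    return total
```

In this sketch, a larger `lam_fusion` drives the estimates toward a common precision matrix (an LDA-like fit), while `lam_fusion = 0` leaves the classes coupled only through the shared ridge penalty.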
Publisher Copyright:
© 2015, © American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
- Discriminant analysis
- Joint inverse covariance matrix estimation
- Model-based clustering
- Semi-supervised learning