Abstract
We propose a penalized likelihood method to fit the linear discriminant analysis model when the predictor is matrix valued. We simultaneously estimate the means and the precision matrix, which we assume has a Kronecker product decomposition. Our penalties encourage pairs of response category mean matrix estimators to have equal entries and also encourage zeros in the precision matrix estimator. To compute our estimators, we use a blockwise coordinate descent algorithm. To update the optimization variables corresponding to response category mean matrices, we use an alternating minimization algorithm that takes advantage of the Kronecker structure of the precision matrix. We show that our method can outperform relevant competitors in classification, even when our modeling assumptions are violated. We analyze three real datasets to demonstrate our method’s applicability. Supplementary materials, including an R package implementing our method, are available online.
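To make the role of the Kronecker-structured precision matrix concrete, the sketch below shows how a fitted matrix-variate LDA rule could score a new observation when the precision matrix of the vectorized predictor factors as V ⊗ U, with U the row precision and V the column precision. This is a minimal illustration written for this summary under that assumption; the function name `matrix_lda_classify` and its arguments are hypothetical and are not the interface of the authors' R package.

```r
## Hypothetical sketch (not the authors' package interface): classify one
## matrix-valued observation under a matrix-normal LDA model whose precision
## matrix has the Kronecker form V %x% U.
##   X       : r x p predictor matrix
##   mu_list : list of r x p class mean matrices
##   U       : r x r row precision matrix
##   V       : p x p column precision matrix
##   priors  : vector of class prior probabilities
matrix_lda_classify <- function(X, mu_list, U, V, priors) {
  scores <- vapply(seq_along(mu_list), function(k) {
    R <- X - mu_list[[k]]  # residual from the class-k mean matrix
    # Matrix-normal log-density up to an additive constant shared by all classes
    log(priors[k]) - 0.5 * sum(diag(V %*% t(R) %*% U %*% R))
  }, numeric(1))
  which.max(scores)  # index of the predicted response category
}
```

In the proposed method, the class means and the factors U and V are estimated jointly by the penalized likelihood and blockwise coordinate descent described above; the sketch only illustrates how the Kronecker structure enters the resulting classification rule.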
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 11-22 |
| Number of pages | 12 |
| Journal | Journal of Computational and Graphical Statistics |
| Volume | 28 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 2 2019 |
Bibliographical note
Funding Information: This research was supported in part by the Doctoral Dissertation Fellowship from the University of Minnesota and the National Science Foundation grant DMS-1452068. The authors thank the associate editor and referees for helpful comments.
Publisher Copyright:
© 2019 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
Keywords
- Alternating minimization algorithm
- Classification
- Penalized likelihood