Abstract
We consider the problem of learning high-dimensional Gaussian graphical models. The graphical lasso is one of the most popular methods for estimating such models, but it does not achieve the oracle rate of convergence. In this paper, we propose a graphical nonconvex optimization framework for optimal estimation in Gaussian graphical models, which we approximate by a sequence of adaptive convex programs. Our proposal is computationally tractable and produces an estimator that achieves the oracle rate of convergence. The statistical error introduced by the sequential approximation is characterized via a contraction property. The proposed methodology is then extended to semiparametric graphical models. We show via numerical studies that the proposed estimator outperforms other popular methods for estimating Gaussian graphical models.
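As a rough illustration of the "sequence of adaptive convex programs" described above, the sketch below solves a weighted graphical lasso at each stage and re-derives the penalty weights from the derivative of a folded-concave (SCAD) penalty evaluated at the previous estimate. This is a hedged reading of the abstract rather than the authors' exact algorithm; the use of cvxpy, the SCAD parameters, and the fixed number of stages are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation): approximate a nonconvex
# penalized Gaussian likelihood by a short sequence of adaptive convex programs.
import numpy as np
import cvxpy as cp

def scad_derivative(t, lam, a=3.7):
    """Derivative of the SCAD penalty; used to form the adaptive weights."""
    t = np.abs(t)
    return np.where(t <= lam, lam,
                    np.where(t <= a * lam,
                             np.maximum(a * lam - t, 0.0) / (a - 1),
                             0.0))

def weighted_graphical_lasso(S, W, eps=1e-4):
    """One convex stage: elementwise-weighted L1-penalized Gaussian likelihood."""
    p = S.shape[0]
    Theta = cp.Variable((p, p), symmetric=True)
    objective = (cp.trace(S @ Theta) - cp.log_det(Theta)
                 + cp.sum(cp.multiply(W, cp.abs(Theta))))
    problem = cp.Problem(cp.Minimize(objective), [Theta >> eps * np.eye(p)])
    problem.solve()
    return Theta.value

def sequential_nonconvex_glasso(X, lam, n_stages=3):
    """Approximate the nonconvex program by a few adaptive convex programs."""
    S = np.cov(X, rowvar=False)
    p = S.shape[0]
    W = np.full((p, p), lam)          # stage 1 reduces to the ordinary graphical lasso
    np.fill_diagonal(W, 0.0)          # leave the diagonal unpenalized
    Theta = None
    for _ in range(n_stages):
        Theta = weighted_graphical_lasso(S, W)
        W = scad_derivative(Theta, lam)   # adapt weights to the current estimate
        np.fill_diagonal(W, 0.0)
    return Theta
```

With uniform weights the first stage coincides with the graphical lasso, and subsequent stages downweight entries that are already estimated as large, which is one common way a nonconvex penalty is handled by iterating convex subproblems.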
| Original language | English (US) |
| --- | --- |
| Title of host publication | 35th International Conference on Machine Learning, ICML 2018 |
| Editors | Jennifer Dy, Andreas Krause |
| Publisher | International Machine Learning Society (IMLS) |
| Pages | 7638-7645 |
| Number of pages | 8 |
| ISBN (Electronic) | 9781510867963 |
| State | Published - 2018 |
| Event | 35th International Conference on Machine Learning, ICML 2018, Stockholm, Sweden; Jul 10, 2018 → Jul 15, 2018 |
Publication series
| Name | 35th International Conference on Machine Learning, ICML 2018 |
| --- | --- |
| Volume | 11 |
Other
| Other | 35th International Conference on Machine Learning, ICML 2018 |
| --- | --- |
| Country/Territory | Sweden |
| City | Stockholm |
| Period | 7/10/18 → 7/15/18 |
Bibliographical note
Funding Information: We thank all three reviewers for their insightful comments. Qiang Sun is supported by a Connaught New Researcher Award and NSERC Grant RGPIN-2018-06484. Tong Zhang is supported by NSF Grant IIS-1407939.
Publisher Copyright:
© 2018 by the Authors. All rights reserved.