Fundamental Limits of Distributed Covariance Matrix Estimation Under Communication Constraints

Mohammad Reza Rahmani, Mohammad Hossein Yassaee, Mohammad Ali Maddah-Ali, Mohammad Reza Aref

Research output: Contribution to journal · Conference article · Peer-reviewed

Abstract

Estimating high-dimensional covariance matrices is crucial in various domains. This work considers a scenario where two collaborating agents access disjoint dimensions of m samples from a high-dimensional random vector, and they can only communicate a limited number of bits to a central server, which wants to accurately approximate the covariance matrix. We analyze the fundamental trade-off between communication cost, number of samples, and estimation accuracy. We prove a lower bound on the error achievable by any estimator, highlighting the impact of dimensions, number of samples, and communication budget. Furthermore, we present an algorithm that achieves this lower bound up to a logarithmic factor, demonstrating its near-optimality in practical settings.
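The setting described above can be illustrated with a minimal simulation. The sketch below is not the paper's algorithm; it is a hypothetical baseline in which each agent coarsely quantizes its own coordinate block (a stand-in for the communication budget) and the server forms a plug-in estimate of the cross-covariance block. All names, dimensions, and the uniform quantizer are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch (not the paper's scheme): two agents observe disjoint
# coordinate blocks of the same m samples and send coarsely quantized values
# to a server, which estimates the cross-covariance block.
rng = np.random.default_rng(0)

d1, d2, m = 4, 4, 20000                # dims held by agent 1 / agent 2; sample count
A = rng.normal(size=(d1 + d2, d1 + d2))
Sigma = A @ A.T / (d1 + d2)            # ground-truth covariance (illustrative)
X = rng.multivariate_normal(np.zeros(d1 + d2), Sigma, size=m)

def quantize(Z, bits):
    """Uniform scalar quantizer on [-c, c] with 2**bits levels (assumed scheme)."""
    c = 3.0 * np.sqrt(Z.var())
    step = 2 * c / (2 ** bits)
    return np.clip(np.round(Z / step) * step, -c, c)

# Each agent quantizes its own block before "transmitting" to the server.
Q1 = quantize(X[:, :d1], bits=4)
Q2 = quantize(X[:, d1:], bits=4)

# Server's plug-in estimate of the cross-covariance block, versus the truth.
C_hat = Q1.T @ Q2 / m
C_true = Sigma[:d1, d1:]
err = np.linalg.norm(C_hat - C_true, ord='fro')
```

Reducing `bits` or `m` inflates `err`, which is the communication/samples/accuracy trade-off the paper quantifies with matching lower and upper bounds.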

Original language: English (US)
Pages (from-to): 41927-41958
Number of pages: 32
Journal: Proceedings of Machine Learning Research
Volume: 235
State: Published - 2024
Event: 41st International Conference on Machine Learning, ICML 2024 - Vienna, Austria
Duration: Jul 21, 2024 - Jul 27, 2024

Bibliographical note

Publisher Copyright:
Copyright 2024 by the author(s)

