Fitting Low-rank Models on Egocentrically Sampled Partial Networks

Ga Ming Angus Chan, Tianxi Li

Research output: Contribution to journal › Conference article › peer-review

Abstract

The statistical modeling of random networks has been widely used to uncover interaction mechanisms in complex systems and to predict unobserved links in real-world networks. In many applications, network connections are collected via egocentric sampling: a subset of nodes is sampled first, after which all links involving this subset are recorded; all other information is missing. Unlike data that are uniformly missing at random, egocentrically sampled partial networks require specially designed modeling strategies. Current statistical methods are either computationally infeasible or based on intuitive designs without theoretical justification. Here, we propose an approach to fit general low-rank models for egocentrically sampled networks, a class that includes several popular network models. The method is based on graph spectral properties and is computationally efficient for large-scale networks. It consistently recovers the subnetworks missing due to egocentric sampling, even for sparse networks. To our knowledge, this method offers the first theoretical guarantee for egocentric partial network estimation within the scope of low-rank models. We evaluate the technique on several synthetic and real-world networks and show that it delivers competitive performance in link prediction tasks.
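To make the sampling scheme concrete, the sketch below simulates egocentric sampling on a small undirected network and applies a naive rank-truncated eigendecomposition as an imputation baseline. This is an illustrative toy, not the paper's estimator; the network, the ego set, and the rank are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, egos = 8, {0, 1, 2}  # hypothetical: 8 nodes, nodes 0-2 are the sampled egos

# Symmetric adjacency matrix of a small undirected network (no self-loops).
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1)
A = A + A.T

# Egocentric sampling: entry (i, j) is observed iff i or j is an ego,
# so the ego-ego and ego-alter blocks are known; the alter-alter block is missing.
mask = np.zeros((n, n), dtype=bool)
for i in egos:
    mask[i, :] = True
    mask[:, i] = True

A_obs = np.where(mask, A, 0)  # unobserved entries zero-filled

# Naive low-rank imputation (a baseline, not the paper's method): keep the
# top-r eigenpairs of the zero-filled matrix by absolute eigenvalue.
r = 2
vals, vecs = np.linalg.eigh(A_obs)
top = np.argsort(np.abs(vals))[-r:]
A_hat = vecs[:, top] @ np.diag(vals[top]) @ vecs[:, top].T
```

With three egos among eight nodes, only the 5×5 alter-alter block is unobserved, so most entries survive sampling; the low-rank step then supplies estimates for that missing block.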

Original language: English (US)
Pages (from-to): 10635-10649
Number of pages: 15
Journal: Proceedings of Machine Learning Research
Volume: 206
State: Published - 2023
Externally published: Yes
Event: 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023 - Valencia, Spain
Duration: Apr 25, 2023 – Apr 27, 2023

Bibliographical note

Publisher Copyright:
Copyright © 2023 by the author(s)

