Nimble GNN Embedding with Tensor-Train Decomposition

Chunxing Yin, Da Zheng, Israt Nisa, Christos Faloutsos, George Karypis, Richard Vuduc

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

This paper describes a new method for representing embedding tables of graph neural networks (GNNs) more compactly via tensor-train (TT) decomposition. We consider the scenario in which (a) the graph data lack node features, so embeddings must be learned during training; and (b) we wish to exploit GPU platforms, where smaller tables are needed to reduce host-to-GPU communication even for large-memory GPUs. The use of TT enables a compact parameterization of the embedding, rendering it small enough to fit entirely on modern GPUs even for massive graphs. When combined with judicious schemes for initialization and hierarchical graph partitioning, this approach can reduce the size of node embedding vectors by 1,659 to 81,362 times on large publicly available benchmark datasets, achieving comparable or better accuracy and significant speedups on multi-GPU systems. In some cases, our model without explicit node features on input can even match the accuracy of models that use node features.
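To illustrate the core idea from the abstract, the sketch below shows how a tensor-train factorization can replace a dense embedding table: each node's embedding is materialized on the fly by contracting small TT cores instead of being stored explicitly. This is a minimal NumPy illustration, not the authors' implementation; the factorizations (512 = 8·8·8 nodes, 64 = 4·4·4 dimensions), the TT-rank of 2, and the helper `tt_embedding` are assumptions for the example.

```python
# Hedged sketch: a TT-parameterized embedding table (illustrative shapes).
import numpy as np

n_factors = (8, 8, 8)      # node-count factors: N = 8*8*8 = 512 nodes
d_factors = (4, 4, 4)      # embedding-dim factors: d = 4*4*4 = 64
ranks = (1, 2, 2, 1)       # TT-ranks; r_0 = r_3 = 1 at the boundaries

rng = np.random.default_rng(0)
# One TT core per mode, with shape (r_{k-1}, n_k, d_k, r_k).
cores = [rng.standard_normal((ranks[k], n_factors[k], d_factors[k], ranks[k + 1]))
         for k in range(3)]

def tt_embedding(node_id):
    """Materialize one node's embedding by contracting the TT cores."""
    # Mixed-radix digits of node_id pick one slice from each core.
    i3 = node_id % n_factors[2]
    i2 = (node_id // n_factors[2]) % n_factors[1]
    i1 = node_id // (n_factors[2] * n_factors[1])
    out = cores[0][:, i1, :, :]                      # (1, d1, r1)
    for core, idx in ((cores[1], i2), (cores[2], i3)):
        slice_k = core[:, idx, :, :]                 # (r_{k-1}, d_k, r_k)
        # Contract over the shared TT rank; embedding axes accumulate
        # Kronecker-style, then are flattened into one axis.
        out = np.einsum('adr,rek->adek', out, slice_k)
        out = out.reshape(out.shape[0], -1, out.shape[-1])
    return out.reshape(-1)                           # length d1*d2*d3 = 64

emb = tt_embedding(137)
dense_params = 512 * 64                    # full table: 32,768 parameters
tt_params = sum(c.size for c in cores)     # TT cores: 256 parameters
print(emb.shape, dense_params, tt_params)
```

Here the TT cores hold 256 parameters versus 32,768 for the dense table (a 128x reduction); the paper's much larger compression ratios come from applying the same idea to graphs with millions of nodes, where the core sizes grow only with the factor sizes, not with the node count.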

Original language: English (US)
Title of host publication: KDD 2022 - Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Publisher: Association for Computing Machinery
Pages: 2327-2335
Number of pages: 9
ISBN (Electronic): 9781450393850
State: Published - Aug 14 2022
Event: 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2022 - Washington, United States
Duration: Aug 14 2022 - Aug 18 2022

Publication series

Name: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining

Conference

Conference: 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2022
Country/Territory: United States
City: Washington
Period: 8/14/22 - 8/18/22

Bibliographical note

Publisher Copyright:
© 2022 ACM.

Keywords

  • embedding
  • graph neural networks
  • tensor-train decomposition
