GNNIE: GNN Inference Engine with Load-balancing and Graph-specific Caching

Sudipta Mondal, Susmita Dey Manasi, Kishor Kunal, S. Ramprasath, Sachin S. Sapatnekar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

Graph neural network (GNN) inference involves weighting vertex feature vectors and then aggregating the weighted vectors over each vertex's neighborhood. High and variable sparsity in the input vertex feature vectors, together with high sparsity and power-law degree distributions in the adjacency matrix, can lead to (a) unbalanced loads and (b) inefficient random memory accesses. GNNIE ensures load balancing by splitting features into blocks, proposing a flexible MAC architecture, and employing load (re)distribution. GNNIE's novel caching scheme bypasses the high cost of random DRAM accesses. GNNIE shows high speedups over CPUs/GPUs; it is faster and runs a broader range of GNNs than existing accelerators.
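The two inference phases the abstract refers to can be illustrated with a minimal sketch. This is a hypothetical illustration, not GNNIE's actual hardware dataflow: it shows the weighting phase (a feature-matrix/weight-matrix multiply) followed by the aggregation phase (summing weighted features over each vertex's neighbors), where variable neighborhood sizes from power-law degree distributions are the source of the load imbalance the paper addresses. The function name and the toy graph are invented for the example.

```python
import numpy as np

def gnn_layer(features, weights, neighbors):
    """One GNN layer, split into the two phases named in the abstract.

    features:  (V, F_in) vertex feature matrix (often highly sparse)
    weights:   (F_in, F_out) layer weight matrix
    neighbors: dict mapping each vertex id to a list of neighbor ids
    """
    # Phase 1: weighting -- transform every vertex's feature vector.
    weighted = features @ weights
    # Phase 2: aggregation -- sum weighted vectors over each neighborhood.
    out = np.zeros((features.shape[0], weights.shape[1]))
    for v, nbrs in neighbors.items():
        # len(nbrs) varies wildly under a power-law degree distribution,
        # causing the unbalanced loads and random accesses GNNIE targets.
        out[v] = weighted[list(nbrs)].sum(axis=0)
    return out

# Toy example: 3 vertices, 2 features, identity weights.
X = np.array([[1.0, 0.0], [0.0, 2.0], [3.0, 0.0]])
W = np.eye(2)
adj = {0: [1, 2], 1: [0], 2: [0, 1]}
H = gnn_layer(X, W, adj)  # H[0] = X[1] + X[2] = [3., 2.]
```

In a software sketch like this the aggregation loop is a sparse-gather over DRAM-resident rows; GNNIE's contribution is handling that irregularity in hardware via load (re)distribution and graph-specific caching.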

Original language: English (US)
Title of host publication: Proceedings of the 59th ACM/IEEE Design Automation Conference, DAC 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 565-570
Number of pages: 6
ISBN (Electronic): 9781450391429
DOIs
State: Published - Jul 10 2022
Event: 59th ACM/IEEE Design Automation Conference, DAC 2022 - San Francisco, United States
Duration: Jul 10 2022 - Jul 14 2022

Publication series

Name: Proceedings of the 59th ACM/IEEE Design Automation Conference

Conference

Conference: 59th ACM/IEEE Design Automation Conference, DAC 2022
Country/Territory: United States
City: San Francisco
Period: 7/10/22 - 7/14/22

Bibliographical note

Funding Information:
This work was supported in part by the Semiconductor Research Corporation (SRC).

Publisher Copyright:
© 2022 ACM.

Keywords

  • GNN
  • graph-specific caching
  • hardware accelerator
  • load balancing
