A Multicore GNN Training Accelerator

Sudipta Mondal, S. Ramprasath, Ziqing Zeng, Kishor Kunal, Sachin S. Sapatnekar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Graph neural networks (GNNs) are vital for analytics on real-world problems with graph models. This work develops a multicore GNN training accelerator with multicore-specific optimizations for superior performance. It uses enhanced multicore-specific dynamic caching to circumvent the costs of the irregular DRAM access patterns of graph-structured data. A novel feature vector segmentation approach maximizes on-chip data reuse, achieving high on-chip computation per memory access and reducing data access latency, guided by a machine learning model for optimal performance. The work presents a major advance over prior FPGA/ASIC GNN accelerators by handling significantly larger datasets (with up to 8.6M vertices) on a variety of GNN models. On average, a training speedup of 17× and an energy efficiency improvement of 322× are achieved over DGL on a GPU; a speedup of 14× with 268× lower energy is shown over GPU-based GNNAdvisor; and 11× and 24× speedups are obtained over ASIC-based Rubik and FPGA-based GraphACT, respectively.
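The abstract's feature vector segmentation idea can be illustrated with a minimal sketch: split each vertex's feature vector into narrow segments and aggregate neighbors one segment at a time, so the active slice fits in on-chip memory and is reused across every edge that touches it. This is a hypothetical software analogue, not the paper's hardware implementation; the function name, sum aggregation, and `seg_width` parameter are assumptions for illustration.

```python
import numpy as np

def segmented_aggregate(features, neighbors, seg_width):
    """Sum-aggregate neighbor features one feature segment at a time.

    Hypothetical illustration of feature vector segmentation: a narrow
    slice of every vertex's feature vector forms the working set, so
    each fetched slice is reused across all edges that touch it before
    moving on to the next segment.
    """
    n, f = features.shape
    out = np.zeros_like(features)
    for start in range(0, f, seg_width):
        end = min(start + seg_width, f)
        seg = features[:, start:end]          # slice held "on chip"
        for v, nbrs in enumerate(neighbors):  # reuse seg across all edges of v
            out[v, start:end] = seg[nbrs].sum(axis=0)
    return out
```

In hardware, the choice of segment width trades off cache footprint against the number of passes over the edge list; the paper's machine learning model is described as tuning this for performance.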

Original language: English (US)
Title of host publication: 2023 IEEE/ACM International Symposium on Low Power Electronics and Design, ISLPED 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350311754
DOIs
State: Published - 2023
Event: 2023 IEEE/ACM International Symposium on Low Power Electronics and Design, ISLPED 2023 - Vienna, Austria
Duration: Aug 7, 2023 – Aug 8, 2023

Publication series

Name: 2023 IEEE/ACM International Symposium on Low Power Electronics and Design (ISLPED)

Conference

Conference: 2023 IEEE/ACM International Symposium on Low Power Electronics and Design, ISLPED 2023
Country/Territory: Austria
City: Vienna
Period: 8/7/23 – 8/8/23

Bibliographical note

Publisher Copyright:
© 2023 IEEE.
