PropInit: Scalable Inductive Initialization for Heterogeneous Graph Neural Networks

Soji Adeshina, Jian Zhang, Muhyun Kim, Min Chen, Rizal Fathony, Advitiya Vashisht, Jia Chen, George Karypis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Graph Neural Networks (GNNs) require that all nodes have initial representations, which are usually derived from the node features. When node features are absent, GNNs can learn node embeddings with an embedding layer or use pre-trained network embeddings as the initial node representations. However, these approaches are limited because i) they cannot be easily extended to initialize new nodes that are added to the graph for inference after training, and ii) they are memory intensive, storing a fixed representation for every node in the graph. In this work, we present PropInit, a scalable node-representation initialization method for training GNNs and other graph machine learning (ML) models on heterogeneous graphs where some or all node types have no natural features. Unlike existing methods that learn a fixed embedding vector for each node, PropInit learns an inductive function that leverages the metagraph to initialize node representations. As a result, PropInit is fully inductive and can be applied, without retraining, to new featureless nodes that are added to the graph. PropInit also scales to large graphs, as it requires only a small fraction of the memory of existing methods. On public benchmark heterogeneous graph datasets, across various GNN models, PropInit achieves performance comparable to or better than competing approaches while needing only 0.01% to 2% of their memory for representing node embeddings. We also demonstrate PropInit's effectiveness on an industry heterogeneous graph dataset for fraud detection, achieving better classification accuracy than learning full embeddings while reducing the embedding memory footprint during training and inference by 99.99%.
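The abstract does not spell out PropInit's formulation, but the core memory argument (a small inductive initializer keyed to the metagraph instead of a per-node embedding table) can be illustrated with a minimal, hypothetical sketch. The class name `TypeInit` and its interface are illustrative assumptions, not the paper's actual API: it stores one vector per node *type*, so memory scales with the number of types rather than the number of nodes, and any node added after training can be initialized without retraining.

```python
# Illustrative sketch only -- NOT the paper's actual method or API.
# Idea: replace a per-node embedding table (num_nodes x dim floats) with
# one vector per node type (num_types x dim floats), so initialization is
# inductive and memory is independent of graph size.
import random


class TypeInit:
    def __init__(self, node_types, dim, seed=0):
        rng = random.Random(seed)
        # One (learnable, in a real model) vector per node type.
        # Memory: num_types * dim, independent of the number of nodes.
        self.table = {
            t: [rng.gauss(0.0, 0.02) for _ in range(dim)]
            for t in node_types
        }
        self.dim = dim

    def init_node(self, node_type):
        # Inductive: works for nodes added to the graph after training,
        # as long as their type already appears in the metagraph.
        return list(self.table[node_type])


init = TypeInit(["user", "item", "review"], dim=4)
vec = init.init_node("user")  # works for any node of a known type, new or old
print(len(vec))
```

For a graph with 1M nodes and dim 128, a full embedding table stores 1,000,000 x 128 floats, while a type-level table for 3 node types stores only 3 x 128, which is the kind of 99.99% reduction in embedding memory the abstract reports.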

Original language: English (US)
Title of host publication: Proceedings - 13th IEEE International Conference on Knowledge Graph, ICKG 2022
Editors: Peipei Li, Kui Yu, Nitesh Chawla, Ronen Feldman, Qing Li, Xindong Wu
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 8
ISBN (Electronic): 9781665451017
State: Published - 2022
Event: 13th IEEE International Conference on Knowledge Graph, ICKG 2022 - Virtual, Online, United States
Duration: Nov 30, 2022 - Dec 1, 2022

Publication series

Name: Proceedings - 13th IEEE International Conference on Knowledge Graph, ICKG 2022


Conference: 13th IEEE International Conference on Knowledge Graph, ICKG 2022
Country/Territory: United States
City: Virtual, Online

Bibliographical note

Publisher Copyright:
© 2022 IEEE.


Keywords

  • graph neural network
  • heterogeneous graphs
  • inductive learning
  • network embedding
  • scalability
  • semi-supervised learning


