Graph Convolutional Networks from the Perspective of Sheaves and the Neural Tangent Kernel

Thomas Gebhart

    Research output: Contribution to journal › Conference article › peer-review

    Abstract

    Graph convolutional networks are a popular class of deep neural network algorithms which have shown success in a number of relational learning tasks. Despite their success, graph convolutional networks exhibit a number of peculiar features, including a bias towards learning oversmoothed and homophilic functions, which are not easily diagnosed due to the complex nature of these algorithms. We propose to bridge this gap in understanding by studying the neural tangent kernel of sheaf convolutional networks, a topological generalization of graph convolutional networks. To this end, we derive a parameterization of the neural tangent kernel for sheaf convolutional networks which separates the function into two parts: one driven by a forward diffusion process determined by the graph, and the other determined by the composite effect of nodes' activations on the output layer. This geometrically focused derivation produces a number of immediate insights which we discuss in detail.
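    To make the flavor of this separation concrete, the following is a minimal LaTeX sketch, not taken from the paper itself, of how such a decomposition arises in the simplest case of a one-layer linear sheaf convolution. The symbols S (a diffusion operator built from the sheaf Laplacian), X (stacked node features), and W (learned weights) are assumptions introduced here purely for illustration.

    % Illustrative assumption, not the paper's derivation: a one-layer linear
    % sheaf convolution f(X; W) = S X W with a single output dimension has a
    % neural tangent kernel that splits into a graph-determined diffusion
    % factor and a feature (activation) Gram factor.
    \[
      f(X; W) = S X W, \qquad S = I - L_{\mathcal{F}},
    \]
    \[
      \Theta_{ij}
        = \Big\langle \tfrac{\partial f_i}{\partial W},\, \tfrac{\partial f_j}{\partial W} \Big\rangle
        = \big[\, S\, X X^{\top} S^{\top} \big]_{ij},
    \]
    % so \Theta = S (X X^\top) S^\top: the outer S factors encode forward
    % diffusion over the graph (or sheaf), while X X^\top collects the
    % composite contribution of the node features feeding the output layer.

    In deeper or nonlinear settings the same two ingredients reappear in more elaborate combinations, which is the structure the abstract alludes to; the single-layer linear case above is only meant to show why a diffusion factor and an activation factor can be cleanly separated.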

    Original language: English (US)
    Pages (from-to): 124-132
    Number of pages: 9
    Journal: Proceedings of Machine Learning Research
    Volume: 196
    State: Published - 2022
    Event: ICML Workshop on Topology, Algebra, and Geometry in Machine Learning, TAG:ML 2022 - Virtual, Online, United States
    Duration: Jul 20 2022 → …

    Bibliographical note

    Publisher Copyright:
    © 2022 Proceedings of Machine Learning Research. All rights reserved.

