On the one hand, we have sheaves, which are often presented with heavy technical machinery and find applications in abstract areas such as Algebraic Topology, Algebraic Geometry, and Categorical Logic. On the other hand, we have Graph Neural Networks (GNNs), which are widely used in machine learning for graph-structured data, with real-world applications in molecular modeling, recommender systems, and more.
In this talk, we will present sheaves on a fixed graph, which admit a simple combinatorial description known as cellular sheaves, and discuss how they can be used to improve GNN models, which requires generalizing the graph Laplacian to the sheaf Laplacian (the degree-0 Hodge Laplacian of a cellular sheaf). We will explore two situations where cellular sheaves are particularly useful: when the input graphs exhibit “heterophilic” characteristics (i.e., graphs in which connected nodes do not share similar attributes), and when we want to avoid the phenomenon where every node converges to the same representation (oversmoothing). These ideas were studied in the 2022 paper "Neural Sheaf Diffusion: A Topological Perspective on Heterophily and Oversmoothing in GNNs" by Cristian Bodnar et al.
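For the curious reader, the Laplacian generalization mentioned above can be sketched in a few lines of numpy. This is a minimal illustration, not code from the paper: the function name and data layout are our own, and we assume d-dimensional stalks on every node and edge, with one restriction map per node-edge incidence. The sheaf Laplacian is built as L_F = δᵀδ, where δ is the coboundary map of the sheaf.

```python
import numpy as np

def sheaf_laplacian(n_nodes, edges, restrictions, d):
    """Build the sheaf Laplacian L_F = delta^T @ delta.

    edges: list of (u, v) node pairs.
    restrictions: dict mapping (edge_index, node) -> (d x d) matrix
        F_{v <| e}, the restriction map from the node stalk to the
        edge stalk. (Illustrative naming, not from any library.)
    d: dimension of every stalk.
    """
    # Coboundary map: one d-dimensional block row per edge,
    # one d-dimensional block column per node.
    delta = np.zeros((d * len(edges), d * n_nodes))
    for i, (u, v) in enumerate(edges):
        # (delta x)_e = F_{v <| e} x_v - F_{u <| e} x_u
        delta[d*i:d*(i+1), d*u:d*(u+1)] = -restrictions[(i, u)]
        delta[d*i:d*(i+1), d*v:d*(v+1)] = restrictions[(i, v)]
    return delta.T @ delta

# Sanity check: with 1-dimensional stalks and identity restriction
# maps, the sheaf Laplacian reduces to the ordinary graph Laplacian
# D - A. Here we use the path graph 0 - 1 - 2.
edges = [(0, 1), (1, 2)]
restrictions = {(i, v): np.eye(1) for i, e in enumerate(edges) for v in e}
L = sheaf_laplacian(3, edges, restrictions, d=1)
```

Choosing non-trivial restriction maps is exactly the extra freedom the sheaf structure provides: it lets diffusion mix or rotate features across an edge instead of merely averaging them, which is what helps on heterophilic graphs.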
No prior knowledge of Category Theory or Neural Networks will be assumed. The goal is to present a very recent application of sheaf theory in machine learning to a broad audience.