Beyond Message Passing: a Physics-Inspired Paradigm for Graph Neural Networks
On going beyond message-passing-based graph neural networks with physics-inspired “continuous” learning models
Preview:
The message-passing paradigm has been the workhorse of deep learning on graphs for several years, making graph neural networks highly successful in a wide range of applications, from particle physics to protein design. From a theoretical viewpoint, it established a link to the Weisfeiler-Lehman hierarchy, allowing one to analyse the expressive power of GNNs. We argue that the “node- and edge-centric” mindset of current graph deep learning schemes imposes strong limitations that hinder future progress in the field. As an alternative, we propose physics-inspired “continuous” learning models that open up a trove of tools from differential geometry, algebraic topology, and differential equations, so far largely unexplored in graph ML.
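To make the contrast concrete, here is a minimal sketch in plain NumPy (all function names and parameters are illustrative, not from any particular library): a single discrete message-passing step, where each node aggregates its neighbours’ features, next to a “continuous” alternative in the spirit of graph diffusion models, where node features evolve under an Euler-discretised heat equation on the graph.

```python
import numpy as np

def message_passing_layer(X, A, W):
    """One discrete message-passing step: each node sums the features
    of its neighbours (given by adjacency matrix A) and applies a
    shared linear map W followed by a nonlinearity."""
    return np.tanh((A @ X) @ W)

def graph_diffusion(X, A, t=1.0, steps=100):
    """A 'continuous' update: integrate the graph heat equation
    dX/dt = (A_norm - I) X up to time t with explicit Euler steps.
    Here depth is replaced by a continuous diffusion time t."""
    deg = A.sum(axis=1, keepdims=True)
    A_norm = A / np.clip(deg, 1e-9, None)   # random-walk normalisation
    tau = t / steps
    for _ in range(steps):
        X = X + tau * (A_norm @ X - X)      # one Euler step
    return X

# Toy usage: diffuse one-hot features over a 3-node path graph.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.eye(3)
print(graph_diffusion(X, A, t=2.0))
```

In the discrete layer, the number of stacked layers fixes how far information travels; in the diffusion view, the integration time t plays that role and can be chosen (or learned) independently of the solver’s step count.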