Graph Neural Networks as Neural Diffusion PDEs
A new post by Michael Bronstein about the connection between GNNs and the differential equations that govern diffusion on graphs. This gives a new mathematical framework for studying different architectures on graphs, as well as a blueprint for developing new ones.
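The diffusion view can be made concrete with a toy sketch (the graph, step size, and function names below are illustrative, not from the post): one explicit-Euler step of the graph heat equation has the same shape as a linear GCN layer.

```python
import numpy as np

# Toy graph: a 4-node path. A is the adjacency matrix.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Symmetrically normalized adjacency: D^{-1/2} A D^{-1/2}.
d = A.sum(axis=1)
A_norm = A / np.sqrt(np.outer(d, d))

def diffusion_step(x, tau=0.5):
    """Explicit Euler step of the graph heat equation dx/dt = (A_norm - I) x:
    x_{t+1} = x_t + tau * (A_norm - I) x_t."""
    return x + tau * (A_norm @ x - x)

x = np.array([1.0, 0.0, 0.0, 0.0])  # heat concentrated on node 0
for _ in range(50):
    x = diffusion_step(x)
print(x)  # the mass has spread out over all four nodes
```

Iterating this update smooths features over the graph, which is exactly the behavior the PDE perspective attributes to stacked GCN-style layers.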
Fresh picks from ArXiv
This week on ArXiv: life science package of DGL, efficient models for knowledge graphs, and explanation insights from tabular data 🤓
If I forgot to mention your paper, please shoot me a message and I will update the post.
Software
* DGL-LifeSci: An Open-Source Toolkit for Deep Learning on Graphs in Life Science
Embeddings
* Simple Truncated SVD based Model for Node Classification on Heterophilic Graphs
* Exploring the Representational Power of Graph Autoencoder
* NodePiece: Compositional and Parameter-Efficient Representations of Large Knowledge Graphs with Mikhail Galkin and William L. Hamilton
* A Deep Latent Space Model for Graph Representation Learning
Explanation
* Towards Automated Evaluation of Explanations in Graph Neural Networks
* Reimagining GNN Explanations with ideas from Tabular Data
Survey
* Graph and hypergraph colouring via nibble methods: A survey
Speech recognition and Graph Transformer Networks
A lecture by Awni Hannun that covers low-resource speech recognition, beam search decoding, finite-state automata, and graph transformer networks.
YouTube
11L – Speech recognition and Graph Transformer Networks
Course website: http://bit.ly/DLSP21-web
Playlist: http://bit.ly/DLSP21-YouTube
Speaker: Awni Hannun
Slides: https://bit.ly/DLSP21-11L
Chapters
00:00 – Guest lecturer introduction
01:10 – Outline
02:36 – Modern speech recognition
09:26 – Connectionist temporal…
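One of the lecture's topics, beam search decoding, can be sketched in a few lines. This is a toy version over per-step token distributions, not the lecture's actual CTC decoder (which also merges repeated tokens and blanks):

```python
import math

def beam_search(log_prob_steps, beam_width=3):
    """Toy beam search over a sequence of per-step log-probability
    distributions (each step is a dict: token -> log prob). Keeps only
    the beam_width highest-scoring prefixes at every step."""
    beams = [((), 0.0)]  # (prefix, cumulative log prob)
    for step in log_prob_steps:
        candidates = []
        for prefix, score in beams:
            for token, lp in step.items():
                candidates.append((prefix + (token,), score + lp))
        # Prune: keep only the best beam_width hypotheses.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
    return beams

# Two time steps over a toy two-token alphabet.
steps = [
    {"a": math.log(0.6), "b": math.log(0.4)},
    {"a": math.log(0.3), "b": math.log(0.7)},
]
best = beam_search(steps, beam_width=2)
print(best[0])  # (('a', 'b'), log(0.6 * 0.7))
```

With beam_width=1 this degenerates to greedy decoding; wider beams trade compute for better hypotheses, which matters in the low-resource settings the lecture discusses.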
Recordings: Graph Neural Networks at CAIMS
Recordings of a session at CAIMS are now available. The talks cover cutting-edge research in the GNN world and are interesting if you want to see what researchers are currently working on in this space.
From local structures to size generalization in graph neural networks by Haggai Maron (NVIDIA)
On the generalization of graph neural networks and their applications to probabilistic inference by Renjie Liao (Google)
Graph convolution for semi-supervised classification: improved linear separability and out-of-distribution generalization by Kimon Fountoulakis (University of Waterloo)
Persistent message passing by Petar Veličković (DeepMind)
On "On Graph Neural Networks versus Graph-Augmented MLPs"
There is a cool ICLR'21 paper "On Graph Neural Networks versus Graph-Augmented MLPs" by Lei Chen, Zhengdao Chen, and Joan Bruna, which studies a question I had in mind for some time: can we replace a graph with some statistics of the graph, which we later use with a standard MLP, without losing quality?
The answer is that for graph-level tasks, such as graph isomorphism, we can indeed capture much of what's needed from the graph to solve the graph isomorphism problem at the same level as the WL test. However, for node-level tasks, there are provably fewer functions on nodes that graph-augmented MLPs can identify than GNNs can.
Roughly, the reason is that GNNs process graph topology and node features at the same time, while graph-augmented MLPs first treat the graph topology and then process node features with an MLP. So theoretically we lose expressive power when we use MLPs instead of GNNs on graph-structured data.
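The structural difference can be sketched in a few lines of numpy (a toy illustration, not the paper's exact models — weights are random, not trained):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 4
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T          # symmetric adjacency, no self-loops
X = rng.standard_normal((n, d))
relu = lambda z: np.maximum(z, 0)
W1, W2 = rng.standard_normal((d, d)), rng.standard_normal((d, d))

# Graph-augmented MLP: graph operators are applied to raw features ONCE,
# then a plain MLP acts per node on the precomputed statistics [X, AX, A^2 X].
ga_input = np.concatenate([X, A @ X, A @ A @ X], axis=1)
W_ga = rng.standard_normal((3 * d, d))
ga_mlp_out = relu(ga_input @ W_ga)

# GNN: propagation and nonlinearity are interleaved, so the second
# aggregation sees *transformed* neighbor features, not raw ones.
gnn_out = relu(A @ relu(A @ X @ W1) @ W2)
```

The GA-MLP's graph statistics are fixed before any learning happens, which is exactly where the expressive-power gap for node-level tasks comes from.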
OpenReview
On Graph Neural Networks versus Graph-Augmented MLPs
From the perspectives of expressive power and learning, this work compares multi-layer Graph Neural Networks (GNNs) with a simplified alternative that we call Graph-Augmented Multi-Layer...
GNN Applications
An overview presentation by Xavier Bresson about applications of GNNs, which include chip design, protein folding, autonomous driving, energy physics, and more.
Dropbox
GNN_applications_Jun21.pdf
Connecting the Dots: Harness the Power of Graphs & ML
A new short book on graph ML that describes various algorithms on graphs and future challenges.
OpenCredo
Connect the Dots: Harness the Power of Graphs & ML Ebook - OpenCredo
Our e-book aims to shed light on what we believe is a real game-changer for those looking to improve upon simplistic answers sometimes arrived at by using traditional ML algorithms and approaches. We show how you are able to combine the power of both graphs…
Fresh picks from ArXiv
This week on ArXiv: WL to solve planar graphs, efficient molecule generation, and compressing graphs 🤐
If I forgot to mention your paper, please shoot me a message and I will update the post.
Math
Logarithmic Weisfeiler-Leman Identifies All Planar Graphs
GNNs
Curvature Graph Neural Network
Relational VAE: A Continuous Latent Variable Model for Graph Structured Data
GraphPiece: Efficiently Generating High-Quality Molecular Graph with Substructures
Privacy-Preserving Representation Learning on Graphs: A Mutual Information Perspective KDD 2021
Partition and Code: learning how to compress graphs with Andreas Loukas and Michael M. Bronstein
Evolving-Graph Gaussian Processes ICML Workshop 2021
How to build E(n) Equivariant Normalizing Flows, for points with features?
A nice post that discusses how one can use normalizing flows and equivariant GNNs to generate realistic molecules.
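The E(n)-equivariant GNN coordinate update the post builds on (from Satorras et al.) can be sketched and sanity-checked for rotation equivariance. The edge function below is a toy scalar, not the paper's learned MLP:

```python
import numpy as np

def egnn_coord_update(x, h):
    """One EGNN-style coordinate update:
    x_i' = x_i + sum_j (x_i - x_j) * phi(h_i, h_j, ||x_i - x_j||^2).
    phi depends only on rotation-invariant quantities, so the update
    is equivariant to rotations of the coordinates."""
    n = x.shape[0]
    x_new = x.copy()
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            diff = x[i] - x[j]
            dist2 = np.dot(diff, diff)
            phi = np.tanh(h[i] @ h[j] - dist2)  # toy invariant edge weight
            x_new[i] = x_new[i] + diff * phi
    return x_new

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))   # 3D coordinates
h = rng.standard_normal((4, 2))   # node features

# Equivariance check: rotating inputs rotates outputs the same way.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal matrix
out_then_rot = egnn_coord_update(x, h) @ Q.T
rot_then_out = egnn_coord_update(x @ Q.T, h)
print(np.allclose(out_then_rot, rot_then_out))  # True
```

This equivariance is what lets the normalizing flow in the post generate molecule coordinates whose likelihood does not depend on their orientation.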
Emiel Hoogeboom
How to build E(n) Equivariant Normalizing Flows, for points with features? | Emiel Hoogeboom
How to build E(n) Equivariant Normalizing Flows from our recent paper? We will discuss 1) Normalizing Flows 2) Continuous Time Normalizing Flows 3) E(n) GNNs, 4) Argmax Flows. Finally we talk about our 5) E(n) Flows. Most of these topics are tangential: if…
Graph Machine Learning research groups: Andreas Krause
I do a series of posts on the groups in graph research; the previous post is here. The 31st is Andreas Krause, a professor at ETH Zurich and a former advisor of Stefanie Jegelka.
Andreas Krause (~1982)
- Affiliation: ETH Zurich
- Education: Ph.D. at CMU in 2008 (advisor: Carlos Guestrin)
- h-index 81
- Interests: social network analysis, community detection, graphical models.
- Awards: Rössler Prize, best papers (AISTATS, AAAI, KDD, ICML)
Fresh picks from ArXiv
This week on ArXiv: explanations of GNNs, generalization of GAE and generating natural proofs 👴
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNN
* Automated Graph Learning via Population Based Self-Tuning GCN
* Quantitative Evaluation of Explainable Graph Neural Networks for Molecular Property Prediction
* Robust Counterfactual Explanations on Graph Neural Networks
* Probabilistic Graph Reasoning for Natural Proof Generation
* On Generalization of Graph Autoencoders with Adversarial Training
* Private Graph Data Release: A Survey
GNN User Group Meeting videos (June)
Video from the June meeting of the GNN user group, which includes a talk about binary GNNs and dynamic graph models by Mehdi Bahri, and a talk about simplifying large-scale visual analysis of tricky data & models with GPUs, graphs, and ML by Leo Meyerovich.
YouTube
Graph Neural Networks User Group Meeting on June 24th, 2021
Agenda 6/24/2021:
• 4:00 - 4:30 (PST): Binary Graph Neural Networks and Dynamic Graph Models (Mehdi Bahri, Imperial College London).
• 4:30 - 5:00 (PST): Simplifying large-scale visual analysis of tricky data & models with GPUs, graphs, and ML (Leo Meyerovich…
Speeding Up the Webcola Graph Viz Library with Rust + WebAssembly
A captivating story about optimizing visualization of graphs in the browser. The code can be found here. Here is a performance comparison of different browser visualization libraries. And here is another efficient library for plotting graphs in a browser.
Casey Primozic's Blog
Speeding Up the Webcola Graph Viz Library with Rust + WebAssembly
LOGML Videos
LOGML is an exciting summer school with projects and talks about graph ML, happening this week. A collection of videos, including presentations of cutting-edge research as well as industrial applications from leading companies, is now available for everyone.
Effortless Distributed Training of Ultra-Wide GCNs
A great post about distributed training of GNNs on large graphs. The architecture splits the GNN into several submodules, each trained independently on a separate GPU, providing the flexibility to significantly increase the hidden dimension of embeddings. As such, this approach is GCN model agnostic, compatible with existing sampling methods, and performs best on very large graphs.
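As a rough illustration of why the hidden dimension can be split at all (a simplified identity, not GIST's actual training procedure, which samples and periodically synchronizes sub-GCNs): with a ReLU between layers, hidden units do not interact within a layer, so disjoint slices of a wide hidden layer can be computed as independent narrow sub-GCNs.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_hidden = 6, 4, 8
Adj = (rng.random((n, n)) < 0.5).astype(float)
Adj = np.triu(Adj, 1); Adj = Adj + Adj.T + np.eye(n)
A = Adj / Adj.sum(axis=1, keepdims=True)   # row-normalized, with self-loops
X = rng.standard_normal((n, d_in))
relu = lambda z: np.maximum(z, 0)

# Full "ultra-wide" two-layer GCN weights.
W1 = rng.standard_normal((d_in, d_hidden))
W2 = rng.standard_normal((d_hidden, d_in))

# Split the hidden dimension into 2 disjoint blocks; each block is a
# narrower sub-GCN that could sit on its own GPU.
blocks = np.split(np.arange(d_hidden), 2)

def sub_gcn_forward(idx):
    h = relu(A @ X @ W1[:, idx])   # this block's slice of the hidden layer
    return A @ h @ W2[idx, :]

# Summing the sub-GCN outputs reproduces the full wide model exactly.
full = A @ relu(A @ X @ W1) @ W2
split_sum = sum(sub_gcn_forward(idx) for idx in blocks)
print(np.allclose(full, split_sum))  # True
```

The identity holds because ReLU acts elementwise; GIST exploits this independence to train each slice separately and only occasionally merge them.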
Medium
Effortless Distributed Training of Ultra-Wide GCNs
An overview of GIST, a novel distributed training framework for large-scale GCNs.
Fresh picks from ArXiv
This week on ArXiv: QA in images, graph matching, and learning robot dynamics 🤖
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Train on Small, Play the Large: Scaling Up Board Games with AlphaZero and GNN
* Reasoning-Modulated Representations with Petar Veličković and Thomas Kipf
* SENSORIMOTOR GRAPH: Action-Conditioned Graph Neural Network for Learning Robotic Soft Hand Dynamics
* Scalable Optimal Transport in High Dimensions for Graph Distances, Embedding Alignment, and More with Stephan Günnemann
* Graphhopper: Multi-Hop Scene Graph Reasoning for Visual Question Answering with Stephan Günnemann
* Elastic Graph Neural Networks
Survey
* A Survey of Knowledge Graph Embedding and Their Applications
Graph Papers at ICML 2021
ICML 2021 is happening this week and here is a list of all relevant graph papers that you can encounter there. There are papers on improving expressiveness, explainability, robustness, normalization and theory.
Awesome Explainable Graph Reasoning
An awesome collection of research papers and software related to explainability in graph machine learning, provided by AstraZeneca. It covers papers on explainable predictions and reasoning, libraries, and survey papers.
GitHub
GitHub - AstraZeneca/awesome-explainable-graph-reasoning: A collection of research papers and software related to explainability…
A collection of research papers and software related to explainability in graph machine learning. - AstraZeneca/awesome-explainable-graph-reasoning
labml.ai Annotated PyTorch Paper Implementations
A very cool collection of popular deep learning blocks, nicely formatted in the browser with extensive comments. Among others, there is a GAT implementation.
Interpretable Deep Learning for New Physics Discovery
In this video, Miles Cranmer (Princeton) discusses a method for converting a neural network into an analytic equation using a particular set of inductive biases. The technique relies on a sparsification of latent spaces in a deep neural network, followed by symbolic regression. In their paper, they demonstrate that they can recover physical laws for various simple and complex systems. For example, they discover gravity along with planetary masses from data; they learn a technique for doing cosmology with cosmic voids and dark matter halos; and they show how to extract the Euler equation from a graph neural network trained on turbulence data.
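The second stage, symbolic regression, can be approximated by sparse fitting over a fixed library of candidate terms. This is a SINDy-style toy stand-in for the actual symbolic regression package used in the paper, but it conveys how an inverse-square law can be recovered from noisy data:

```python
import numpy as np

rng = np.random.default_rng(0)
r = rng.uniform(1.0, 5.0, size=200)
y = 3.0 / r**2 + 0.001 * rng.standard_normal(200)  # noisy "force" data

# Library of candidate terms; real symbolic regression searches a far
# richer space of expressions, not just a fixed linear library.
names = ["1", "r", "r^2", "1/r", "1/r^2"]
library = np.stack([np.ones_like(r), r, r**2, 1 / r, 1 / r**2], axis=1)

coef, *_ = np.linalg.lstsq(library, y, rcond=None)
coef[np.abs(coef) < 0.05] = 0.0   # sparsify: keep only dominant terms
recovered = " + ".join(f"{c:.2f}*{n}" for c, n in zip(coef, names) if c != 0)
print(recovered)  # roughly "3.00*1/r^2"
```

In the paper, the latent messages of a sparsified GNN (rather than raw measurements) are what get fed to the symbolic regressor, which is how laws like the Euler equation emerge from the trained network.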
YouTube
Interpretable Deep Learning for New Physics Discovery
In this video, Miles Cranmer discusses a method for converting a neural network into an analytic equation using a particular set of inductive biases. The technique relies on a sparsification of latent spaces in a deep neural network, followed by symbolic…