Fresh picks from ArXiv
This week on ArXiv: explanations of GNNs, generalization of GAEs, and generating natural proofs 👴
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNN
* Automated Graph Learning via Population Based Self-Tuning GCN
* Quantitative Evaluation of Explainable Graph Neural Networks for Molecular Property Prediction
* Robust Counterfactual Explanations on Graph Neural Networks
* Probabilistic Graph Reasoning for Natural Proof Generation
* On Generalization of Graph Autoencoders with Adversarial Training
* Private Graph Data Release: A Survey
GNN User Group Meeting videos (June)
Video from the June meeting of the GNN user group, which includes talks about binary GNNs and dynamic graph models by Mehdi Bahri, and about simplifying large-scale visual analysis of tricky data & models with GPUs, graphs, and ML by Leo Meyerovich.
YouTube
Graph Neural Networks User Group Meeting on June 24th, 2021
Agenda 6/24/2021:
• 4:00 - 4:30 (PST): Binary Graph Neural Networks and Dynamic Graph Models (Mehdi Bahri, Imperial College London).
• 4:30 - 5:00 (PST): Simplifying large-scale visual analysis of tricky data & models with GPUs, graphs, and ML (Leo Meyerovich…
Speeding Up the Webcola Graph Viz Library with Rust + WebAssembly
A captivating story about optimizing visualization of graphs in the browser. The code can be found here. Here is a performance comparison of different browser visualization libraries. And here is another efficient library for plotting graphs in a browser.
Casey Primozic's Blog
Speeding Up the Webcola Graph Viz Library with Rust + WebAssembly
LOGML Videos
LOGML is an exciting summer school with projects and talks about graph ML happening this week. A collection of videos, featuring presentations of cutting-edge research as well as industrial applications from leading companies, is now available to everyone.
www.logml.ai
Effortless Distributed Training of Ultra-Wide GCNs
A great post about distributed training of GNNs on large graphs. The architecture splits the GNN into several submodules, each trained independently on a separate GPU, which makes it possible to significantly increase the hidden dimension of the embeddings. The approach is agnostic to the GCN model, compatible with existing sampling methods, and performs best on very large graphs.
Medium
Effortless Distributed Training of Ultra-Wide GCNs
An overview of GIST, a novel distributed training framework for large-scale GCNs.
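To make the idea more concrete, here is a minimal PyTorch sketch of the feature-partitioning trick described in the post: the hidden dimension of a GCN layer is split into narrow sub-layers that could be trained independently (e.g., on separate GPUs) and concatenated afterwards. This is my own illustration, not the GIST code; all class and variable names are hypothetical.

```python
import torch
import torch.nn as nn

class NarrowGCNLayer(nn.Module):
    """One sub-GCN layer operating on a slice of the hidden dimension."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, adj_norm, x):
        # Standard GCN propagation: normalized adjacency times transformed features.
        return torch.relu(adj_norm @ self.lin(x))

class UltraWideGCNLayer(nn.Module):
    """A wide layer assembled from independent narrow sub-layers.

    In the distributed setting each sub-layer would live on its own GPU and be
    trained independently; here they are kept in one list for illustration.
    """
    def __init__(self, in_dim, hidden_dim, num_splits):
        super().__init__()
        assert hidden_dim % num_splits == 0
        self.subs = nn.ModuleList(
            [NarrowGCNLayer(in_dim, hidden_dim // num_splits) for _ in range(num_splits)]
        )

    def forward(self, adj_norm, x):
        # Concatenating the narrow outputs recovers the full hidden width.
        return torch.cat([sub(adj_norm, x) for sub in self.subs], dim=-1)

# Toy usage: 5 nodes, 8 input features, effective hidden width 64 split 4 ways.
adj_norm = torch.eye(5)                      # stand-in for D^{-1/2} A D^{-1/2}
x = torch.randn(5, 8)
layer = UltraWideGCNLayer(in_dim=8, hidden_dim=64, num_splits=4)
print(layer(adj_norm, x).shape)              # torch.Size([5, 64])
```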
Fresh picks from ArXiv
This week on ArXiv: QA in images, graph matching, and learning robot dynamics 🤖
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Train on Small, Play the Large: Scaling Up Board Games with AlphaZero and GNN
* Reasoning-Modulated Representations with Petar Veličković and Thomas Kipf
* SENSORIMOTOR GRAPH: Action-Conditioned Graph Neural Network for Learning Robotic Soft Hand Dynamics
* Scalable Optimal Transport in High Dimensions for Graph Distances, Embedding Alignment, and More with Stephan Günnemann
* Graphhopper: Multi-Hop Scene Graph Reasoning for Visual Question Answering with Stephan Günnemann
* Elastic Graph Neural Networks
Survey
* A Survey of Knowledge Graph Embedding and Their Applications
Graph Papers at ICML 2021
ICML 2021 is happening this week, and here is a list of all the relevant graph papers you can encounter there. There are papers on improving expressiveness, explainability, robustness, normalization, and theory.
Awesome Explainable Graph Reasoning
An awesome collection of research papers and software related to explainability in graph machine learning, provided by AstraZeneca. It covers papers on explainable predictions and reasoning, libraries, and survey papers.
GitHub
GitHub - AstraZeneca/awesome-explainable-graph-reasoning: A collection of research papers and software related to explainability…
A collection of research papers and software related to explainability in graph machine learning. - AstraZeneca/awesome-explainable-graph-reasoning
labml.ai Annotated PyTorch Paper Implementations
A very cool collection of popular deep learning blocks, nicely formatted in the browser with extensive comments. Among others, there is a GAT implementation.
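For a flavor of what such annotated blocks cover, here is a compact, single-head GAT-style attention layer in plain PyTorch. This is my own minimal sketch rather than the labml.ai implementation, and the dense-adjacency formulation is for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGATLayer(nn.Module):
    """Single-head graph attention layer (dense adjacency, for illustration)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        h = self.lin(x)                                  # (N, out_dim)
        n = h.size(0)
        # Attention score for every ordered pair (i, j): a^T [h_i || h_j].
        pairs = torch.cat(
            [h.unsqueeze(1).expand(n, n, -1), h.unsqueeze(0).expand(n, n, -1)],
            dim=-1,
        )
        e = F.leaky_relu(self.attn(pairs).squeeze(-1), negative_slope=0.2)
        # Mask non-edges, then normalize attention over each node's neighbors.
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)
        return alpha @ h

# Toy usage on a 4-node graph with self-loops.
adj = torch.tensor([[1, 1, 0, 0], [1, 1, 1, 0], [0, 1, 1, 1], [0, 0, 1, 1]])
x = torch.randn(4, 16)
print(TinyGATLayer(16, 8)(x, adj).shape)  # torch.Size([4, 8])
```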
Interpretable Deep Learning for New Physics Discovery
In this video, Miles Cranmer (Princeton) discusses a method for converting a neural network into an analytic equation using a particular set of inductive biases. The technique relies on a sparsification of latent spaces in a deep neural network, followed by symbolic regression. In their paper, they demonstrate that they can recover physical laws for various simple and complex systems. For example, they discover gravity along with planetary masses from data; they learn a technique for doing cosmology with cosmic voids and dark matter halos; and they show how to extract the Euler equation from a graph neural network trained on turbulence data.
YouTube
Interpretable Deep Learning for New Physics Discovery
In this video, Miles Cranmer discusses a method for converting a neural network into an analytic equation using a particular set of inductive biases. The technique relies on a sparsification of latent spaces in a deep neural network, followed by symbolic…
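A rough sketch of the sparsification step, assuming the high-level recipe described above (my own illustration, not the paper's code): train a message-passing GNN with an L1 penalty on its edge messages so that only a few message components stay active, then fit symbolic regression to those components. All module and variable names below are hypothetical.

```python
import torch
import torch.nn as nn

class SparseMessageGNN(nn.Module):
    """Toy interaction network with an L1 penalty applied to its edge messages."""
    def __init__(self, node_dim, msg_dim):
        super().__init__()
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * node_dim, 64), nn.ReLU(), nn.Linear(64, msg_dim)
        )
        self.node_mlp = nn.Sequential(
            nn.Linear(node_dim + msg_dim, 64), nn.ReLU(), nn.Linear(64, node_dim)
        )

    def forward(self, x, edge_index):
        src, dst = edge_index
        msgs = self.edge_mlp(torch.cat([x[src], x[dst]], dim=-1))
        # Sum incoming messages for every receiving node.
        agg = torch.zeros(x.size(0), msgs.size(-1)).index_add_(0, dst, msgs)
        return self.node_mlp(torch.cat([x, agg], dim=-1)), msgs

model = SparseMessageGNN(node_dim=4, msg_dim=32)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One toy training step: random particle states, a random edge list, random targets.
x = torch.randn(10, 4)
edge_index = torch.randint(0, 10, (2, 40))
target = torch.randn(10, 4)

pred, msgs = model(x, edge_index)
loss = ((pred - target) ** 2).mean() + 1e-2 * msgs.abs().mean()  # MSE + L1 on messages
loss.backward()
optimizer.step()

# After real training, only a handful of message components stay active; those
# components are the candidates handed to a symbolic regression tool to recover
# an analytic law.
print(msgs.abs().mean(dim=0))
```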
Fresh picks from ArXiv
This week on ArXiv: SOTA for protein energy prediction, another solution to OGB-LSC challenge, and a new dataset based on Wikipedia 📚
If I forgot to mention your paper, please shoot me a message and I will update the post.
Applications
* X-GGM: Graph Generative Modeling for Out-of-Distribution Generalization in Visual Question Answering
* Structure-aware Interactive Graph Neural Networks for the Prediction of Protein-Ligand Binding Affinity KDD 2021
GNNs
* Local2Global: Scaling global representation learning on graphs via local training
* Ego-GNNs: Exploiting Ego Structures in Graph Neural Networks with William L. Hamilton
* Bridging the Gap between Spatial and Spectral Domains: A Theoretical Framework for Graph Neural Networks
* Large-scale graph representation learning with very deep GNNs and self-supervision with Petar Veličković
* Group Contrastive Self-Supervised Learning on Graphs with Shuiwang Ji
Datasets
* WikiGraphs: A Wikipedia Text - Knowledge Graph Paired Dataset with Oriol Vinyals
Graph Neural Networks User Group: July meeting
This month, the GNN user group talks about a new release of DGL and applications of GNNs. Please join this Thursday!
4:00 - 4:15 PM (PDT): DGL 0.7 release (Dr. Minjie Wang, Amazon)
4:15 - 4:30 PM (PDT): Storing Node Features in GPU memory to speed up billion-scale GNN training (Dr. Dominique LaSalle, NVIDIA)
4:30 - 5:00 PM (PDT): Locally Private Graph Neural Networks (Sina Sajadmanesh, Idiap Research Institute, Switzerland).
5:00 - 5:30 PM (PDT): Graph Embedding and Application in Meituan (Mengdi Zhang, Meituan).
Eventbrite
Graph Neural Networks User Group
Header-Only C++ Library for Graph Representation and Algorithms
In case you need the speed of C++ for well-known graph algorithms, there is a nice repo that collects many of them.
GitHub
GitHub - ZigRazor/CXXGraph: Header-Only C++ Library for Graph Representation and Algorithms
Header-Only C++ Library for Graph Representation and Algorithms - ZigRazor/CXXGraph
Graph Convolutional Neural Networks to Analyze Complex Carbohydrates
A blog post by Daniel Bojar about an application of GNNs to analyzing glycan sequences and the proposed GNN architecture called SweetNet. There is other coverage of this work (here and here). The paper is here and the code is here.
Graph Machine Learning research groups: Shuiwang Ji
I do a series of posts on the groups in graph research; the previous post is here. The 32nd is Shuiwang Ji, a professor at Texas A&M University. His teams won awards at the OGB-LSC and AI Cures challenges. He has also recently advised graph libraries such as MoleculeX and DIG.
Shuiwang Ji (~1982)
- Affiliation: Texas A&M University
- Education: Ph.D. at Arizona State University in 2008 (advisor: Jieping Ye)
- h-index: 44
- Interests: GNNs, self-supervised learning, surveys, libraries.
- Awards: best papers at KDD, WWW, ACM Distinguished Member
Graph Machine Learning research groups: Andreas Krause
I do a series of posts on the groups in graph research; the previous post is here. The 31st is Andreas Krause, a professor at ETH Zurich and an advisor of Stefanie Jegelka.
Andreas Krause (~1982)
- Affiliation:…
GNN User Group Videos
Videos from last Thursday's meeting of the GNN user group are available now. These include an update on the DGL library, storing node features for large graphs, and locally private GNNs.
YouTube
Graph Neural Networks User Group Meeting on July 29th, 2021
July meeting agenda:
4:00 - 4:15 (PDT): DGL 0.7 Release
Abstract: The new 0.7 release of DGL brings many new updates to both system performance and usability. We will highlight some of the new features as well as how we work with our user community as…
Fresh picks from ArXiv
This week on ArXiv: time series recovery, GNN challenge winning solutions, and benchmark for scene graph generation 🌳
If I forgot to mention your paper, please shoot me a message and I will update the post.
Applications
* Multivariate Time Series Imputation by Graph Neural Networks
* Temporal-Relational Hypergraph Tri-Attention Networks for Stock Trend Prediction
* Graph Constrained Data Representation Learning for Human Motion Segmentation
* The Graph Neural Networking Challenge: A Worldwide Competition for Education in AI/ML for Networks
* Image Scene Graph Generation (SGG) Benchmark
* Structack: Structure-based Adversarial Attacks on Graph Neural Networks
Foundations of Graph Neural Networks Course
A new upcoming course by Zak Jost (you may remember his videos on GNNs) on the foundations of GNNs, which covers topics such as the following (a toy message-passing sketch follows the list):
- Neural Message Passing
- Fourier Transforms, Graph Wavelets and Spectral Convolutions
- Permutation Symmetries
- Representational capacity of GNNs
- Graph fundamentals like the Laplacian and graph isomorphism.
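As promised above, here is a toy sketch of one round of neural message passing (my own illustration, unrelated to the course materials): each node aggregates learned messages from its neighbors and updates its state.

```python
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    """One round of neural message passing over a directed edge list."""
    def __init__(self, dim):
        super().__init__()
        self.message = nn.Linear(2 * dim, dim)   # message from (sender, receiver) states
        self.update = nn.GRUCell(dim, dim)       # update node state with the aggregate

    def forward(self, x, edge_index):
        src, dst = edge_index
        msgs = torch.relu(self.message(torch.cat([x[src], x[dst]], dim=-1)))
        agg = torch.zeros_like(x).index_add_(0, dst, msgs)   # sum over incoming edges
        return self.update(agg, x)

# Toy usage: 6 nodes with 8-dimensional states and a ring of directed edges.
x = torch.randn(6, 8)
edge_index = torch.tensor([[0, 1, 2, 3, 4, 5], [1, 2, 3, 4, 5, 0]])
print(MessagePassingLayer(8)(x, edge_index).shape)  # torch.Size([6, 8])
```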
GML YouTube Videos
I was pleasantly surprised to see that there is a YouTube playlist by Zak Jost covering some aspects of GNNs, including an interview with the DeepMind authors on using GNNs for physics.
Graph Neural Networks: Algorithms and Applications
A great presentation by Jian Tang about GNN basics, training many layers, self-supervised learning and statistical relational learning.
Knowledge Graphs in Natural Language Processing @ ACL 2021
A regular update from Michael Galkin on the SOTA applications of KGs in the world of words:
Neural Databases & Retrieval
KG-augmented Language Models
KG Embeddings & Link Prediction
Entity Alignment
KG Construction, Entity Linking, Relation Extraction
KGQA: Temporal, Conversational, and AMR.
Medium
Knowledge Graphs in Natural Language Processing @ ACL 2021
Your guide to the KG-related NLP research, ACL edition