
Scalable Graph Compressed Convolutions

by Junshu Sun, Shuhui Wang, Chenxue Yang, Qingming Huang

First submitted to arXiv on: 26 Jul 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here
Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper addresses two fundamental challenges in designing effective graph neural networks (GNNs): determining optimal message-passing pathways and designing local aggregators. Existing methods either lose information from the input features or fail to extract multi-scale features. The authors propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution, enabling a flexible generalization of Euclidean convolution to graphs. They also introduce the Compressed Convolution Network (CoCN) for hierarchical graph representation learning, which can be trained end-to-end with compressed convolution and incorporates successful practices from Euclidean convolution, such as residual connections and the inception mechanism. CoCN outperforms competitive GNN baselines on both node-level and graph-level benchmarks.
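To make the core idea concrete, here is a deliberately simplified sketch of "permute, then convolve." The paper's actual method learns a differentiable permutation; this toy version substitutes a hard `argsort` over hypothetical per-node scores, then slides a standard 1-D convolution over the reordered node-feature sequence. All function names, shapes, and the scoring scheme here are illustrative assumptions, not the paper's API.

```python
import numpy as np

def permute_and_convolve(features, scores, kernel):
    """Toy illustration of calibrating a graph for Euclidean convolution.

    features: (n_nodes, d) node feature matrix
    scores:   (n_nodes,) hypothetical learned ranking scores
    kernel:   (k, d) convolution kernel sliding over the node sequence

    NOTE: a hard argsort stands in for the paper's differentiable
    permutation; it is not trainable by gradient descent.
    """
    order = np.argsort(-scores)      # permutation: order nodes by score
    x = features[order]              # "calibrated" node sequence
    n, _ = x.shape
    k = kernel.shape[0]
    out = np.empty(n - k + 1)
    for i in range(n - k + 1):
        # valid 1-D convolution over consecutive nodes in the new order
        out[i] = np.sum(x[i:i + k] * kernel)
    return out

# Example with 4 nodes, 3-dim features, and a kernel spanning 2 nodes.
feats = np.arange(12, dtype=float).reshape(4, 3)
scores = np.array([0.1, 0.9, 0.5, 0.3])
kern = np.ones((2, 3))
y = permute_and_convolve(feats, scores, kern)  # → array([33., 51., 33.])
```

Once nodes are arranged in a fixed sequence, any standard convolutional machinery (stride, residual connections, inception-style multi-kernel branches) can be applied, which is what motivates the calibration step.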
Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper helps computers learn from graphs by adapting an old idea, called Euclidean convolution, to graph neural networks (GNNs). The problem is that graphs are not arranged like numbers in a row, so the input has to be reordered before convolution can work. The authors come up with a clever way to do this reordering and then build a new model, called CoCN, that learns from the reordered inputs. The model can be trained all at once and borrows tricks from regular convolutional networks, such as residual connections and inception-style modules. It works better than other GNNs on lots of different tasks.

Keywords

  • Artificial intelligence
  • Generalization
  • GNN
  • Representation learning