Summary of Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification, by Beini Xie et al.


Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification

by Beini Xie, Heng Chang, Ziwei Zhang, Zeyang Zhang, Simin Wu, Xin Wang, Yuan Meng, Wenwu Zhu

First submitted to arXiv on: 24 Jun 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI); Social and Information Networks (cs.SI)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract, available via the arXiv listing.

Medium Difficulty Summary (original content by GrooveSquid.com)
The proposed Lightweight Graph Neural Architecture Search with Graph SparsIfication and Network Pruning (GASSIP) method enables efficient search for optimal lightweight Graph Neural Networks (GNNs) in resource-constrained scenarios. GASSIP jointly optimizes the graph data and the architecture: it identifies important sub-architectures with the help of valuable graph data, iteratively optimizing the two modules through differentiable masks. Evaluated on five benchmarks, GASSIP achieves node classification performance comparable to, or even better than, searched GNNs while using half or fewer model parameters and a sparser graph; a minimal sketch of this mask-based optimization follows below.
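
To make the idea concrete, here is a minimal, self-contained PyTorch sketch of this kind of joint mask-based optimization: a differentiable mask over graph edges (data sparsification) and a differentiable mask over candidate operations (architecture pruning), updated in alternation with the GNN weights. The MaskedGNN module, the two-operation search space, and the L1 penalty weight are illustrative assumptions for exposition, not the authors' implementation.

    # Sketch only: illustrative names, not the paper's code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MaskedGNN(nn.Module):
        def __init__(self, in_dim, hid_dim, out_dim, num_edges, num_ops=2):
            super().__init__()
            self.lin1 = nn.Linear(in_dim, hid_dim)
            self.lin2 = nn.Linear(hid_dim, out_dim)
            # Differentiable masks: one logit per graph edge, one per candidate op.
            self.edge_logits = nn.Parameter(torch.zeros(num_edges))
            self.op_logits = nn.Parameter(torch.zeros(num_ops))

        def propagate(self, x, edge_index, edge_w):
            # Mean aggregation over incoming edges, weighted by the soft edge mask.
            src, dst = edge_index
            out = torch.zeros_like(x)
            out.index_add_(0, dst, edge_w.unsqueeze(-1) * x[src])
            deg = torch.zeros(x.size(0))
            deg.index_add_(0, dst, edge_w)
            return out / deg.clamp(min=1e-6).unsqueeze(-1)

        def forward(self, x, edge_index):
            edge_w = torch.sigmoid(self.edge_logits)      # soft graph sparsification
            op_w = torch.softmax(self.op_logits, dim=0)   # soft operation selection
            h = F.relu(self.lin1(x))
            # Two toy candidate operations: neighbor aggregation vs. identity.
            h = op_w[0] * self.propagate(h, edge_index, edge_w) + op_w[1] * h
            return self.lin2(h)

    # Toy data: 6 nodes with 8 features, 3 classes, a ring of 6 directed edges.
    torch.manual_seed(0)
    x = torch.randn(6, 8)
    y = torch.randint(0, 3, (6,))
    edge_index = torch.tensor([[0, 1, 2, 3, 4, 5],
                               [1, 2, 3, 4, 5, 0]])

    model = MaskedGNN(8, 16, 3, num_edges=edge_index.size(1))
    mask_params = [model.edge_logits, model.op_logits]
    weight_params = [p for n, p in model.named_parameters() if "logits" not in n]
    opt_w = torch.optim.Adam(weight_params, lr=1e-2)
    opt_m = torch.optim.Adam(mask_params, lr=1e-2)

    for epoch in range(100):
        # Phase 1: update the GNN weights with the masks held fixed.
        opt_w.zero_grad()
        F.cross_entropy(model(x, edge_index), y).backward()
        opt_w.step()
        # Phase 2: update the masks, with an L1 penalty pushing the edge mask
        # toward a sparser graph.
        opt_m.zero_grad()
        loss = F.cross_entropy(model(x, edge_index), y)
        loss = loss + 1e-3 * torch.sigmoid(model.edge_logits).sum()
        loss.backward()
        opt_m.step()

    # After the search, threshold the masks: drop low-weight edges to get the
    # sparsified graph and keep only the strongest candidate operation.
    kept_edges = edge_index[:, torch.sigmoid(model.edge_logits) > 0.5]

Alternating the two optimizers mirrors the iterative two-module optimization described above: the architecture is trained on the current soft graph, then the masks are refined against the current architecture.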

Low Difficulty Summary (original content by GrooveSquid.com)
GASSIP is a new way to find the best Graph Neural Networks (GNNs) for tasks that must run with little computing power. This matters because many devices, like smartphones or smart home devices, have limited resources. The method combines two main ideas: pruning-based architecture search and curriculum graph data sparsification. Pruning removes unnecessary parts of a model to make it smaller and faster. Curriculum learning teaches a model gradually, making the task easier at first and harder later. By combining these ideas, GASSIP finds lightweight GNNs that work well under tight resource limits; a toy sketch of the curriculum schedule follows below.
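
As a rough illustration of the curriculum component, the sketch below keeps a shrinking top-scoring fraction of edges as training proceeds, so the model first trains on an easier, denser graph and later on a harder, sparser one. The scoring rule and the linear schedule are simplifying assumptions, not the paper's exact procedure.

    # Sketch only: an assumed linear edge-keeping schedule.
    import torch

    torch.manual_seed(0)
    edge_index = torch.randint(0, 6, (2, 20))   # toy graph: 6 nodes, 20 edges
    edge_scores = torch.rand(20)                # e.g., learned edge-mask values

    def sparsify(edge_index, edge_scores, keep_ratio):
        # Keep only the top-scoring fraction of edges.
        num_keep = max(1, int(keep_ratio * edge_scores.numel()))
        kept = torch.topk(edge_scores, num_keep).indices
        return edge_index[:, kept]

    for epoch in range(100):
        # Linear curriculum: keep 90% of the edges at first and only 50% by
        # the end, so early training sees an easier (denser) graph and later
        # training a harder (sparser) one.
        keep_ratio = 0.9 - 0.4 * epoch / 99
        train_edges = sparsify(edge_index, edge_scores, keep_ratio)
        # ... one GNN training epoch on `train_edges` would go here ...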

Keywords

  • Artificial intelligence
  • Classification
  • Curriculum learning
  • Pruning