Summary of GNNavigator: Towards Adaptive Training of Graph Neural Networks via Automatic Guideline Exploration, by Tong Qiao, Jianlei Yang, Yingjie Qi, Ao Zhou, Chen Bai, Bei Yu, Weisheng Zhao, and Chunming Hu
GNNavigator: Towards Adaptive Training of Graph Neural Networks via Automatic Guideline Exploration
by Tong Qiao, Jianlei Yang, Yingjie Qi, Ao Zhou, Chen Bai, Bei Yu, Weisheng Zhao, Chunming Hu
First submitted to arXiv on: 15 Apr 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract on arXiv |
Medium | GrooveSquid.com (original content) | The paper proposes GNNavigator, a framework for adaptively optimizing the training of Graph Neural Networks (GNNs). The goal is to balance runtime cost, memory consumption, and attainable accuracy across various applications. To achieve this, the authors develop a unified software-hardware co-abstraction, a training performance model, and a design space exploration solution. Experimental results show that GNNavigator achieves up to 3.1x speedup and 44.9% peak memory reduction while maintaining accuracy comparable to state-of-the-art approaches. (An illustrative sketch of this kind of design-space search appears below the table.) |
Low | GrooveSquid.com (original content) | GNNs are a special kind of artificial intelligence network built for data shaped like a graph, such as networks of friends or linked web pages. However, training these networks efficiently is a big challenge. The authors of this paper created a new way to train GNNs called GNNavigator. This framework makes training faster and less memory-hungry while still getting good results. The paper shows that GNNavigator can speed up training by up to about 3 times and cut peak memory use by almost half, without sacrificing accuracy. |
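The medium-difficulty summary describes GNNavigator as exploring a design space of training configurations against runtime, memory, and accuracy requirements. The toy Python sketch below only illustrates that general idea of design space exploration: the tuning knobs, the cost model, and all numbers are invented for illustration and are not taken from the paper.

```python
# Hedged illustration only: a toy design-space exploration loop in the spirit of
# searching GNN training configurations that trade off runtime, memory, and accuracy.
# Every knob and formula here is hypothetical; the paper's actual abstractions and
# performance model are not reproduced.
from itertools import product

# Hypothetical knobs a GNN training configuration might expose.
BATCH_SIZES = [512, 1024, 2048]
SAMPLING_FANOUTS = [(10, 10), (15, 10), (25, 10)]
CACHE_RATIOS = [0.0, 0.25, 0.5]

def estimate_cost(batch_size, fanout, cache_ratio):
    """Toy analytical performance model (placeholder, not from the paper).

    Returns (runtime_per_epoch_s, peak_memory_gb, expected_accuracy)."""
    sampled_neighbors = fanout[0] * fanout[1]
    runtime = 0.002 * batch_size * sampled_neighbors / 1024 * (1.0 - 0.4 * cache_ratio)
    memory = 2.0 + 0.001 * batch_size + 4.0 * cache_ratio
    accuracy = 0.70 + 0.0005 * sampled_neighbors  # more sampling -> slightly higher accuracy
    return runtime, memory, min(accuracy, 0.80)

def explore(memory_budget_gb, accuracy_target):
    """Pick the fastest configuration that satisfies the memory and accuracy constraints."""
    best = None
    for bs, fo, cr in product(BATCH_SIZES, SAMPLING_FANOUTS, CACHE_RATIOS):
        runtime, memory, acc = estimate_cost(bs, fo, cr)
        if memory <= memory_budget_gb and acc >= accuracy_target:
            if best is None or runtime < best[0]:
                best = (runtime, {"batch_size": bs, "fanout": fo, "cache_ratio": cr,
                                  "peak_mem_gb": memory, "est_acc": acc})
    return best

if __name__ == "__main__":
    print(explore(memory_budget_gb=6.0, accuracy_target=0.75))
```

The sketch only conveys the flavor of estimating costs, filtering by constraints, and picking a configuration; the paper's unified co-abstraction and performance model are considerably richer than this simple grid search.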