Summary of FIT-GNN: Faster Inference Time for GNNs Using Coarsening, by Shubhajit Roy et al.
FIT-GNN: Faster Inference Time for GNNs Using Coarsening
by Shubhajit Roy, Hrriday Ruparel, Kishan Ved, Anirban Dasgupta
First submitted to arXiv on: 19 Oct 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Machine Learning (stat.ML)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here |
| Medium | GrooveSquid.com (original content) | This paper aims to improve the scalability of Graph Neural Networks (GNNs) by reducing the computational burden of both training and inference. The proposed method uses two coarsening-based techniques, Extra-Nodes and Cluster-Nodes, which perform competitively with traditional GNNs on graph classification and regression tasks. The approach achieves single-node inference times that are orders of magnitude faster while significantly reducing memory consumption, making it feasible for low-resource devices where traditional methods struggle. |
| Low | GrooveSquid.com (original content) | This paper introduces a new way to make Graph Neural Networks (GNNs) handle large amounts of data. Normally, GNNs take a long time to process big graphs because they must look at every node and every connection. The authors propose two ways to speed this up: Extra-Nodes and Cluster-Nodes. Tested on several datasets, these methods classify and regress nodes and graphs about as well as regular GNNs, but much faster. This matters because it could let GNNs run in settings that previously lacked the resources. |
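The coarsening idea behind these summaries — merging groups of nodes into super-nodes so the network operates on a much smaller graph — can be sketched in a few lines. The snippet below is a generic illustration of graph coarsening with an assignment matrix and mean-pooled features; it is not the paper's actual Extra-Nodes or Cluster-Nodes construction, and the function and variable names are hypothetical.

```python
import numpy as np

def coarsen(adj, features, clusters):
    """Coarsen a graph by merging each cluster of nodes into one super-node.

    adj:      (n, n) adjacency matrix
    features: (n, d) node feature matrix
    clusters: length-n array assigning each node to a cluster id

    Returns the coarse adjacency matrix (aggregated edge weights) and
    mean-pooled coarse features. A GNN run on this smaller graph touches
    far fewer nodes and edges, which is the source of the speedup.
    """
    n = len(clusters)
    k = clusters.max() + 1
    # Assignment matrix P: P[i, c] = 1 if node i belongs to cluster c.
    P = np.zeros((n, k))
    P[np.arange(n), clusters] = 1.0
    coarse_adj = P.T @ adj @ P              # sum edge weights between clusters
    sizes = P.sum(axis=0, keepdims=True)    # number of nodes per cluster
    coarse_feat = (P.T @ features) / sizes.T  # mean-pool node features
    return coarse_adj, coarse_feat

# Toy example: a 4-node path graph coarsened into two super-nodes,
# merging {0, 1} into cluster 0 and {2, 3} into cluster 1.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feat = np.array([[1.0], [3.0], [5.0], [7.0]])
c_adj, c_feat = coarsen(adj, feat, np.array([0, 0, 1, 1]))
```

Here the 4-node graph collapses to 2 super-nodes: intra-cluster edges become self-loop weights on the diagonal of `c_adj`, the single inter-cluster edge survives off-diagonal, and each super-node's feature is the mean of its members.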
Keywords
» Artificial intelligence » Classification » Inference » Regression