Summary of TinyGraph: Joint Feature and Node Condensation for Graph Neural Networks, by Yezi Liu et al.
TinyGraph: Joint Feature and Node Condensation for Graph Neural Networks
by Yezi Liu, Yanning Shen
First submitted to arXiv on: 10 Jul 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here |
| Medium | GrooveSquid.com (original content) | TinyGraph tackles the challenge of training graph neural networks (GNNs) on large-scale graphs by condensing nodes and features simultaneously. Existing graph condensation methods reduce only the number of nodes, so the condensed graphs still carry high-dimensional features and remain cumbersome. TinyGraph casts the problem as matching the gradients of GNN weights trained on the condensed graph with those obtained from training on the original graph, and condenses features through a trainable function. On the Cora and Citeseer datasets, the condensed graphs retain high test accuracy while substantially reducing both the number of nodes and the number of features. |
| Low | GrooveSquid.com (original content) | TinyGraph is a new way to make it easier to train computer models that work with big graphs. Training these models can be very slow because they must process lots of information about every node in the graph. To speed things up, researchers usually reduce the number of nodes, but that still leaves a lot of data per node. TinyGraph is different: it condenses both the nodes and the features (the characteristics) of each node at the same time, keeping the important information from the original graph while making training faster. The results show that models trained with TinyGraph can be just as accurate as those trained on the full data, but much faster. |
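To make the gradient-matching idea above concrete, here is a minimal NumPy sketch. It is not the paper's method: it swaps the GNN for a toy linear model with a closed-form gradient, and all names, shapes, and the linear condensation map `P` are illustrative assumptions. The point is only to show the objective: a condensed dataset should induce the same training gradients as the original one, with features reduced by a trainable map.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data (not from the paper): original features X (n x d)
# with labels y, and a smaller condensed set X_s (n_s x d_s) with labels y_s.
n, d = 100, 20
n_s, d_s = 10, 5
X, y = rng.normal(size=(n, d)), rng.normal(size=n)
X_s, y_s = rng.normal(size=(n_s, d_s)), rng.normal(size=n_s)

# Trainable feature-condensation map P (d -> d_s): a stand-in for the
# trainable function TinyGraph uses to condense features.
P = rng.normal(size=(d, d_s)) * 0.1

def grad_linear_mse(features, targets, w):
    """Gradient of 0.5 * mean((features @ w - targets)^2) w.r.t. w."""
    return features.T @ (features @ w - targets) / len(targets)

w = rng.normal(size=d_s)  # shared model weights in the condensed feature space

# Gradient from the original graph (features mapped into the condensed space)
g_full = grad_linear_mse(X @ P, y, w)
# Gradient from the condensed graph
g_cond = grad_linear_mse(X_s, y_s, w)

# Gradient-matching loss: optimizing X_s, y_s, and P against this quantity
# drives the condensed data to reproduce the original training gradients.
match_loss = np.sum((g_full - g_cond) ** 2)
print(f"gradient-matching loss: {match_loss:.4f}")
```

In the actual framework this matching loss would be minimized over the condensed nodes and the feature-condensation function, accumulated across training steps of the GNN weights; the sketch shows a single such comparison.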
Keywords
* Artificial intelligence * GNN