Summary of Federated Graph Condensation with Information Bottleneck Principles, by Bo Yan et al.
Federated Graph Condensation with Information Bottleneck Principles
by Bo Yan, Sihao He, Cheng Yang, Shang Liu, Yang Cao, Chuan Shi
First submitted to arXiv on: 7 May 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI); Cryptography and Security (cs.CR); Distributed, Parallel, and Cluster Computing (cs.DC)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract here |
Medium | GrooveSquid.com (original content) | This paper proposes Federated Graph Condensation (FGC), a framework for condensing graphs for graph neural networks in a federated setting. FGC decouples the typical gradient matching process into client-side gradient calculation and server-side gradient matching, integrating knowledge from many clients' subgraphs into one smaller condensed graph (see the sketch after this table). However, the authors found that in a federated setting the condensed graph can leak data membership privacy, leaving it vulnerable to membership inference attacks. To address this, they incorporate information bottleneck principles into FGC: node features are pre-trained locally and reused throughout federated training. The resulting framework consistently protects membership privacy while matching or exceeding the performance of centralized graph condensation and federated graph learning methods. |
Low | GrooveSquid.com (original content) | The paper proposes a new way to make big graphs smaller so they are easier to use in machine learning, called Federated Graph Condensation (FGC). FGC combines information from many small pieces of a graph, held by different owners, into one smaller version. The catch is that this process might reveal who holds which data. To fix this, the authors train the model using an idea called the information bottleneck: node features are prepared locally in a special way before the main federated training. This helps make sure the condensed graph can't be used to figure out who has what data. |
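To make the decoupling described in the medium summary concrete, here is a minimal sketch of federated gradient matching. This is not the authors' code: a plain linear model stands in for a GNN, two clients with random node features stand in for real private subgraphs, and the paper's information-bottleneck feature pre-training is omitted. All names, shapes, and hyperparameters are hypothetical.

```python
import torch
import torch.nn.functional as F

# Hypothetical setup: 2 clients, each holding a private subgraph
# (here just random node features X and labels y); the server learns
# a small set of condensed node features X_syn with fixed labels.
torch.manual_seed(0)
n_feat, n_class, n_syn = 16, 4, 8

model = torch.nn.Linear(n_feat, n_class)  # stand-in for a GNN
clients = [(torch.randn(50, n_feat), torch.randint(0, n_class, (50,)))
           for _ in range(2)]

X_syn = torch.randn(n_syn, n_feat, requires_grad=True)  # condensed graph
y_syn = torch.arange(n_syn) % n_class
opt = torch.optim.Adam([X_syn], lr=0.01)

def client_gradients(X, y):
    # Client side: compute gradients of the shared model on the local
    # subgraph and upload only the gradients, never the raw data.
    loss = F.cross_entropy(model(X), y)
    return torch.autograd.grad(loss, tuple(model.parameters()))

for step in range(100):
    # Server aggregates the per-client gradients.
    grads = [client_gradients(X, y) for X, y in clients]
    g_real = [sum(gs) / len(grads) for gs in zip(*grads)]

    # Server side: make the condensed graph's gradients match the
    # aggregated client gradients (squared-difference matching loss).
    loss_syn = F.cross_entropy(model(X_syn), y_syn)
    g_syn = torch.autograd.grad(loss_syn, tuple(model.parameters()),
                                create_graph=True)
    match = sum(((a - b) ** 2).sum() for a, b in zip(g_syn, g_real))
    opt.zero_grad()
    match.backward()
    opt.step()
```

A real pipeline would also periodically re-initialize or update the GNN so the condensed graph matches gradients across many model states, and would use the locally pre-trained, information-bottleneck-regularized node features that the paper introduces to protect membership privacy.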
Keywords
» Artificial intelligence » Inference » Machine learning