


FedGT: Federated Node Classification with Scalable Graph Transformer

by Zaixi Zhang, Qingyong Hu, Yang Yu, Weibo Gao, Qi Liu

First submitted to arXiv on: 26 Jan 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
In this paper, the researchers tackle the problem of training Graph Neural Networks (GNNs) on large relational graphs whose data is distributed across multiple local systems. Existing approaches such as subgraph federated learning suffer from two limitations: links between local subgraphs are missing, and heterogeneity across subgraphs is overlooked. To address these challenges, the authors propose FedGT, a scalable Federated Graph Transformer that combines a hybrid attention scheme with an online clustering algorithm to capture both local and global information, while also accounting for data heterogeneity and privacy.

Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper is about finding new ways to train computer models on big networks of connected things. Right now, these networks are too big for one computer to handle alone, so they have to be split into smaller pieces, with each piece processed separately. This can be tricky because the different pieces might not be connected in a way that helps the model learn. The authors came up with a new way to solve this problem: combining local and global information, which makes the model better at understanding relationships between things.

Keywords

* Artificial intelligence  * Attention  * Clustering  * Federated learning  * Transformer