
Sequential Federated Learning in Hierarchical Architecture on Non-IID Datasets

by Xingrun Yan, Shiyuan Zuo, Rongfei Fan, Han Hu, Li Shen, Puning Zhao, Yong Luo

First submitted to arXiv on: 19 Aug 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (paper authors)
The paper's original abstract, available on arXiv.

Medium Difficulty Summary (GrooveSquid.com, original content)
A novel federated learning algorithm, called Fed-CHS, is introduced to reduce communication overhead in hierarchical federated learning (HFL) systems. The approach combines sequential federated learning (SFL) with HFL, eliminating the need for a central parameter server: the model is trained by passing local updates between edge servers. The proposed algorithm achieves convergence performance comparable to existing methods under various data heterogeneity setups. Experimental results show that Fed-CHS outperforms baseline methods in terms of both communication overhead saving and test accuracy.
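The sequential, server-free pass described above can be illustrated with a toy simulation. This is a minimal sketch, not the paper's actual Fed-CHS update rule: the ring order of edge servers, the least-squares client objective, and the plain model averaging are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(model, client_data, lr=0.1):
    """One gradient step of least-squares regression on a client's data."""
    X, y = client_data
    grad = X.T @ (X @ model - y) / len(y)
    return model - lr * grad

def edge_round(model, clients):
    """An edge server averages its clients' locally updated models."""
    return np.mean([local_update(model, c) for c in clients], axis=0)

# Synthetic non-IID data: each edge server's clients draw features from a
# shifted distribution, but all share the same underlying target weights.
d = 3
true_w = np.ones(d)

def make_client(shift):
    X = rng.normal(shift, 1.0, size=(20, d))
    return X, X @ true_w

edges = [[make_client(s) for _ in range(4)] for s in (-1.0, 0.0, 1.0)]

# Sequential pass: the model travels edge server -> edge server in a ring,
# with no central parameter server involved.
model = np.zeros(d)
for _ in range(50):              # training rounds
    for clients in edges:        # visit each edge server in turn
        model = edge_round(model, clients)

print(model)                     # converges toward true_w despite non-IID data
```

Despite each edge server seeing a different feature distribution, the sequentially passed model drifts toward the shared optimum, which is the intuition behind combining SFL with the hierarchical layout.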
Low Difficulty Summary (GrooveSquid.com, original content)
Federated learning is a way for multiple devices to learn from each other without sharing their data. Usually, this process happens through a central server, but it can be slow because all the information needs to be sent back and forth. To make it faster, researchers have come up with a new idea that involves edge servers, which are like smaller helpers that can do some of the work. This new method is called sequential federated learning, or SFL for short. It's the first time this technique has been used in hierarchical federated learning, and it seems to work really well: the results show that it not only cuts down on communication but also gets better accuracy.

Keywords

» Artificial intelligence  » Federated learning