
Embracing Federated Learning: Enabling Weak Client Participation via Partial Model Training

by Sunwoo Lee, Tuo Zhang, Saurav Prakash, Yue Niu, Salman Avestimehr

First submitted to arXiv on: 21 Jun 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Distributed, Parallel, and Cluster Computing (cs.DC)


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
This version is the paper's original abstract, available on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
The proposed federated learning framework, EmbracingFL, enables weak clients to participate in distributed training through partial model training: each client trains as many consecutive output-side layers as its system resources allow. The study demonstrates that this method encourages similar data representations across clients, improving FL efficiency, and guarantees convergence to a neighborhood of stationary points for non-convex, smooth problems. EmbracingFL is evaluated under a variety of settings that mix strong and weak clients, datasets (CIFAR-10, FEMNIST, and IMDB), and models (ResNet20, CNN, and LSTM). The results show that EmbracingFL consistently achieves accuracy close to that obtained when all clients are strong, outperforming state-of-the-art width-reduction methods.
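
To make the partial training idea concrete, here is a minimal PyTorch-style sketch in which a weak client freezes its input-side layers and trains only its last few output-side blocks. The model, the `trainable_layers` split, and the optimizer settings are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

def prepare_partial_model(model: nn.Sequential, trainable_layers: int):
    """Freeze input-side blocks; keep only the last `trainable_layers`
    consecutive output-side blocks trainable (illustrative sketch)."""
    blocks = list(model.children())
    cutoff = len(blocks) - trainable_layers
    for i, block in enumerate(blocks):
        for p in block.parameters():
            # A weak client updates only the output-side blocks it can afford.
            p.requires_grad = i >= cutoff
    trainable = [p for p in model.parameters() if p.requires_grad]
    return torch.optim.SGD(trainable, lr=0.01)

# Hypothetical 3-block model; this weak client can train the last 2 blocks.
model = nn.Sequential(
    nn.Sequential(nn.Linear(784, 256), nn.ReLU()),   # input-side (frozen)
    nn.Sequential(nn.Linear(256, 128), nn.ReLU()),   # trained
    nn.Linear(128, 10),                              # trained (output side)
)
optimizer = prepare_partial_model(model, trainable_layers=2)

# One local SGD step on a dummy batch (shapes are assumptions).
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

Because no gradients are needed for the frozen input-side prefix, its activations need not be retained for the backward pass, which is where the memory and compute savings for weak clients come from.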

Low Difficulty Summary (written by GrooveSquid.com, original content)
EmbracingFL is a new way for computers to work together to learn from data without sharing their individual information. Some of these computers might not have enough memory or processing power, so the system was designed to let them still contribute to the learning process. The method works by letting each computer train as many parts of the model as it can handle. This helps ensure that all the computers are working together to create a similar understanding of the data, which makes the whole system more efficient. The researchers tested EmbracingFL with different types of datasets and models, and found that it performed just as well as if all the computers were strong.
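
On the server side, one plausible way to combine such heterogeneous updates is layer-wise averaging, where each parameter is averaged over only the clients that actually trained it. The sketch below is a hypothetical aggregation rule consistent with the summaries above, not necessarily the paper's exact scheme.

```python
from collections import defaultdict
import torch

def aggregate_partial_updates(global_state, client_states):
    """Average each parameter over only the clients that trained (and
    therefore submitted) it; untouched parameters keep the global value.
    Hypothetical sketch; the paper's exact rule may differ."""
    sums, counts = defaultdict(float), defaultdict(int)
    for state in client_states:               # one state dict per client
        for name, tensor in state.items():
            sums[name] = sums[name] + tensor  # 0.0 + tensor is a tensor
            counts[name] += 1
    return {
        name: sums[name] / counts[name] if counts[name] else tensor
        for name, tensor in global_state.items()
    }

# A strong client submits every layer; a weak client submits only the head.
strong = {"body.weight": torch.ones(2, 2), "head.weight": torch.ones(1, 2)}
weak = {"head.weight": torch.zeros(1, 2)}
global_state = {k: torch.zeros_like(v) for k, v in strong.items()}
print(aggregate_partial_updates(global_state, [strong, weak]))
# body.weight averages over 1 client; head.weight averages over 2.
```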

Keywords

* Artificial intelligence  * CNN  * Federated learning  * LSTM