

Boosting Federated Learning with FedEntOpt: Mitigating Label Skew by Entropy-Based Client Selection

by Andreas Lutz, Gabriele Steidl, Karsten Müller, Wojciech Samek

First submitted to arXiv on: 2 Nov 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI); Distributed, Parallel, and Cluster Computing (cs.DC)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper and are written at different levels of difficulty. The medium- and low-difficulty versions are original summaries written by GrooveSquid.com, while the high-difficulty version is the paper’s original abstract. Feel free to read the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (original content by GrooveSquid.com)
This paper proposes a novel federated learning method called FedEntOpt that addresses label distribution skew in decentralized deep learning. In each round, the method selects clients so as to maximize the entropy of the aggregated global label distribution, ensuring that the aggregated model parameters are exposed to data from all available labels and thereby improving the accuracy of the global model. The authors show that FedEntOpt outperforms state-of-the-art algorithms on multiple benchmark datasets, achieving up to 6% higher classification accuracy regardless of model size. FedEntOpt also performs robustly in scenarios with low participation rates and client dropout, where it achieves increases in classification accuracy of over 30%, and it can be combined with existing algorithms to enhance their performance by up to 40%. A minimal sketch of the entropy-based selection idea follows the summaries below.
Low Difficulty Summary (original content by GrooveSquid.com)
Federated learning is a way for devices to work together on deep learning projects without sharing sensitive data, which helps keep personal information private. But when the data differs from one device to another, it can cause problems. The authors of this paper developed a new method called FedEntOpt that solves this problem: it makes sure the model is trained with data from all available labels, making it more accurate. The results show that FedEntOpt works better than other methods, even when some devices don’t participate or drop out. This is important for real-world applications where not all devices will be involved.
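
To make the selection idea concrete, below is a minimal, hypothetical Python sketch of greedy entropy-based client selection. It is not the authors' implementation: the input format (per-client label-count vectors), the function names, and the greedy strategy are assumptions made purely for illustration of how maximizing the entropy of the combined label distribution could drive client selection.

    # Illustrative sketch only, not the FedEntOpt reference code.
    # Assumes each client can report a per-class label-count vector.
    import numpy as np

    def label_entropy(counts):
        """Shannon entropy of a label-count vector (empty vectors give 0)."""
        total = counts.sum()
        if total == 0:
            return 0.0
        p = counts / total
        p = p[p > 0]
        return float(-(p * np.log(p)).sum())

    def select_clients(client_label_counts, k):
        """Greedily pick k clients whose combined label distribution has maximal entropy.

        client_label_counts: dict mapping client id -> np.ndarray of per-class counts
        (hypothetical input format chosen for this sketch).
        """
        selected, combined = [], None
        remaining = set(client_label_counts)
        for _ in range(min(k, len(remaining))):
            best_id, best_entropy = None, -np.inf
            for cid in remaining:
                candidate = (client_label_counts[cid] if combined is None
                             else combined + client_label_counts[cid])
                h = label_entropy(candidate)
                if h > best_entropy:
                    best_id, best_entropy = cid, h
            selected.append(best_id)
            combined = (client_label_counts[best_id] if combined is None
                        else combined + client_label_counts[best_id])
            remaining.remove(best_id)
        return selected

    # Example: three clients with skewed label distributions over 3 classes.
    counts = {
        "client_a": np.array([100, 0, 0]),
        "client_b": np.array([0, 90, 10]),
        "client_c": np.array([0, 0, 80]),
    }
    print(select_clients(counts, 2))  # picks clients whose union covers the labels most evenly

In this toy example, the greedy step prefers client pairs whose pooled labels spread probability mass across all classes, which mirrors the paper's stated goal of exposing the aggregated model to data from all available labels each round.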

Keywords

» Artificial intelligence  » Classification  » Deep learning  » Dropout  » Federated learning