Revisiting Ensembling in One-Shot Federated Learning

by Youssef Allouah, Akash Dhasade, Rachid Guerraoui, Nirupam Gupta, Anne-Marie Kermarrec, Rafael Pinot, Rafael Pires, Rishi Sharma

First submitted to arXiv on: 11 Nov 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Distributed, Parallel, and Cluster Computing (cs.DC)

GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same paper, each written at a different level of difficulty. The medium and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary
Written by: paper authors
Read the original abstract here

Medium Difficulty Summary
Written by: GrooveSquid.com (original content)
Federated learning (FL) offers a promising approach to training machine learning models without sharing raw data. However, traditional FL algorithms are iterative and incur significant communication costs. One-shot federated learning (OFL) mitigates this issue by exchanging models in a single communication round. However, OFL exhibits a notable accuracy gap relative to FL, especially under high data heterogeneity. To bridge this gap, we introduce FENS, a novel federated ensembling scheme that approaches FL’s accuracy while retaining OFL’s communication efficiency. FENS proceeds in two phases: first, clients train models locally and send them to the server, as in OFL; second, clients collaboratively train a lightweight prediction aggregator using FL. We demonstrate FENS’s effectiveness through extensive experiments across various datasets and heterogeneity levels. For instance, on CIFAR-10 with a heterogeneous data distribution, FENS achieves up to 26.9% higher accuracy than state-of-the-art (SOTA) OFL, while being only 3.1% below FL. At the same time, FENS incurs at most 4.3x more communication than OFL and remains at least 10.9x less communication-intensive than FL.
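
To make the two-phase recipe concrete, here is a minimal sketch in Python/NumPy. It is not the authors’ implementation: the least-squares local classifiers, the per-model-weight aggregator, and names like `train_local_model` and `client_update` are illustrative assumptions; the paper specifies FENS’s actual aggregator architecture and training procedure.

```python
# Minimal sketch of the two-phase FENS workflow, assuming simple linear
# models and a per-model-weight aggregator (illustrative assumptions,
# not the authors' exact design).
import numpy as np

rng = np.random.default_rng(0)
NUM_CLIENTS, NUM_CLASSES, DIM = 5, 3, 8

# Synthetic per-client data (stand-in for each device's private dataset).
client_data = [(rng.normal(size=(40, DIM)), rng.integers(0, NUM_CLASSES, 40))
               for _ in range(NUM_CLIENTS)]

def train_local_model(X, y):
    # Hypothetical local training: a least-squares linear classifier
    # mapping features to one-hot class scores.
    Y = np.eye(NUM_CLASSES)[y]
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W  # shape (DIM, NUM_CLASSES)

# --- Phase 1 (one-shot): each client trains locally and uploads its model.
local_models = [train_local_model(X, y) for X, y in client_data]

# --- Phase 2 (lightweight FL): collaboratively learn aggregation weights,
# one scalar per uploaded model, with FedAvg-style rounds.
alpha = np.ones(NUM_CLIENTS) / NUM_CLIENTS
LR, ROUNDS, LOCAL_STEPS = 0.1, 20, 5

def client_update(alpha, X, y):
    # A few local gradient steps on the cross-entropy loss of the
    # weighted ensemble of the (frozen) phase-1 models.
    a = alpha.copy()
    preds = np.stack([X @ W for W in local_models])  # (models, n, classes)
    onehot = np.eye(NUM_CLASSES)[y]
    for _ in range(LOCAL_STEPS):
        logits = np.einsum('m,mnc->nc', a, preds)
        probs = np.exp(logits - logits.max(axis=1, keepdims=True))
        probs /= probs.sum(axis=1, keepdims=True)
        grad_logits = (probs - onehot) / len(y)
        a -= LR * np.einsum('mnc,nc->m', preds, grad_logits)
    return a

for _ in range(ROUNDS):
    # Server averages the clients' locally updated aggregator parameters.
    alpha = np.mean([client_update(alpha, X, y) for X, y in client_data], axis=0)

print("learned ensemble weights:", np.round(alpha, 3))
```

The design point the sketch illustrates is that phase 2 exchanges only the tiny aggregator parameters (here, one scalar weight per client model) rather than full models, which is why FENS stays close to OFL’s communication cost instead of FL’s.
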
Low Difficulty Summary
Written by: GrooveSquid.com (original content)
Federated learning helps train machine learning models without sharing data. A problem with traditional federated learning is that it takes many rounds of information exchange between devices, which uses a lot of bandwidth. One-shot federated learning tries to solve this by having only one exchange. However, this approach isn’t as good at getting accurate results, especially when the data is very different across devices. To fix this, we created FENS, a new way of doing federated ensembling that gets close to traditional federated learning’s accuracy while still being efficient. FENS has two steps: first, devices train models locally and send them to the server, just like in one-shot federated learning. Then, devices work together to train a lightweight prediction model using traditional federated learning. We tested FENS on many datasets and showed that it gets accuracy close to traditional federated learning while using much less communication.

Keywords

» Artificial intelligence  » Federated learning  » Machine learning  » One shot