Recurrent Early Exits for Federated Learning with Heterogeneous Clients

by Royson Lee, Javier Fernandez-Marques, Shell Xu Hu, Da Li, Stefanos Laskaridis, Łukasz Dudziak, Timothy Hospedales, Ferenc Huszár, Nicholas D. Lane

First submitted to arXiv on: 23 May 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Computer Vision and Pattern Recognition (cs.CV); Distributed, Parallel, and Cluster Computing (cs.DC)

Abstract of paper · PDF of paper


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper’s original abstract; read it via the “Abstract of paper” link above.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
Federated learning (FL) has enabled distributed learning, but accommodating clients with varying hardware capacities remains a challenge. Recent state-of-the-art approaches leverage early exits, yet they fall short of mitigating the challenges of jointly learning multiple exit classifiers. In this work, we propose ReeFL, a recurrent early exit approach that fuses features from different sub-models into a single shared classifier using a transformer-based early-exit module. This module better exploits multi-layer feature representations for task-specific prediction and modulates the feature representation of the backbone model. We also present a per-client self-distillation approach where the best sub-model is automatically selected as the teacher of the other sub-models. Our experiments on image and speech classification benchmarks demonstrate ReeFL’s effectiveness over previous works, achieving state-of-the-art results.
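
The following is a minimal, hypothetical sketch of the mechanism the medium summary describes: one transformer block and one classifier, shared across all exits and applied recurrently to the features each sub-model emits, plus a per-client self-distillation loss in which the best-performing exit teaches the others. It is written in PyTorch-style Python; all names (RecurrentEarlyExit, self_distillation_loss), the tensor shapes, the additive feature fusion, and the per-batch teacher selection are illustrative assumptions, not the authors’ implementation.

```python
# Minimal, hypothetical sketch in the spirit of ReeFL. Module names,
# shapes, and hyperparameters are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RecurrentEarlyExit(nn.Module):
    """One shared transformer block + one shared classifier, applied
    recurrently to the features produced at each backbone exit."""
    def __init__(self, dim: int, num_classes: int, num_heads: int = 4):
        super().__init__()
        # A single transformer layer is reused at every exit, so all
        # exits share parameters (the "recurrent" part).
        self.block = nn.TransformerEncoderLayer(
            d_model=dim, nhead=num_heads, batch_first=True)
        self.classifier = nn.Linear(dim, num_classes)  # shared across exits

    def forward(self, exit_features: list[torch.Tensor]) -> list[torch.Tensor]:
        """exit_features: per-exit token sequences, each of shape (B, T, dim)."""
        logits, state = [], None
        for feats in exit_features:
            # Fuse the running state with the current exit's features
            # (additive fusion is an assumption), then refine with the
            # shared transformer block.
            fused = feats if state is None else feats + state
            state = self.block(fused)
            # Mean-pool the tokens and classify with the shared head.
            logits.append(self.classifier(state.mean(dim=1)))
        return logits

def self_distillation_loss(logits: list[torch.Tensor],
                           targets: torch.Tensor) -> torch.Tensor:
    """Per-client self-distillation: the exit with the lowest
    cross-entropy on this batch acts as teacher for the others."""
    ce = [F.cross_entropy(l, targets) for l in logits]
    teacher = logits[min(range(len(ce)), key=lambda i: ce[i].item())]
    kd = sum(F.kl_div(F.log_softmax(l, dim=-1),
                      F.softmax(teacher.detach(), dim=-1),
                      reduction="batchmean")
             for l in logits)
    return sum(ce) + kd
```

In this sketch, a resource-constrained client would run only the first few backbone blocks and take the corresponding early exit; because the block and the classifier are shared across exits, every client trains the same head regardless of how deep its hardware allows it to go.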
Low Difficulty Summary (written by GrooveSquid.com, original content)
Imagine computers working together to learn from data without sharing personal information. That’s federated learning! One problem is that different computers have different power levels: some are strong, while others are weak. This makes it hard for them to work together. In this research, we came up with a new method called ReeFL (Recurrent Early Exits for Federated Learning) that helps these computers learn from each other more effectively. We use a special tool that combines the strengths of each computer and lets them share knowledge. Our results show that ReeFL works better than previous methods!

Keywords

» Artificial intelligence  » Classification  » Distillation  » Federated learning  » Transformer