FedMap: Iterative Magnitude-Based Pruning for Communication-Efficient Federated Learning

by Alexander Herzog, Robbie Southam, Ioannis Mavromatis, Aftab Khan

First submitted to arXiv on: 27 Jun 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (paper authors)
Read the original abstract on arXiv.

Medium Difficulty Summary (GrooveSquid.com, original content)
This paper introduces FedMap, a novel method that enhances the communication efficiency of Federated Learning (FL) deployments by collaboratively learning an increasingly sparse global model through iterative, unstructured pruning. Unlike other methods reported in the literature, FedMap trains a global model from scratch, making it ideal for privacy-critical use cases such as medical and finance domains where pre-training data is limited. The paper adapts iterative magnitude-based pruning to the FL setting, ensuring all clients prune and refine the same subset of global model parameters, reducing communication overhead and gradually shrinking the global model size. FedMap’s iterative nature avoids parameter reactivation issues seen in prior work, resulting in stable performance. The paper provides an extensive evaluation across diverse settings, datasets, model architectures, and hyperparameters, assessing performance in both IID and non-IID environments.
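
To make the idea of iterative, magnitude-based pruning with a shared global mask more concrete, here is a minimal sketch of one way such a loop could look. This is not the authors' implementation: the functions `magnitude_mask` and `local_update`, the random placeholder client updates, and the sparsity schedule are all hypothetical assumptions chosen for illustration.

```python
# Illustrative sketch only: simplified iterative magnitude-based pruning
# with a single global mask shared by all clients in a FedAvg-style loop.
# Names and the sparsity schedule are assumptions, not taken from the paper.
import numpy as np

def magnitude_mask(params, sparsity):
    """Keep the largest-magnitude weights globally; mark the rest as pruned."""
    flat = np.concatenate([p.ravel() for p in params.values()])
    k = int(len(flat) * sparsity)                       # number of weights to prune
    threshold = np.sort(np.abs(flat))[k] if k > 0 else -np.inf
    return {name: (np.abs(p) >= threshold) for name, p in params.items()}

def local_update(params, mask, lr=0.01, seed=0):
    """Placeholder client step: a masked (pruned) update standing in for local training."""
    rng = np.random.default_rng(seed)
    return {name: (p - lr * rng.normal(size=p.shape)) * mask[name]
            for name, p in params.items()}

# Toy global model and a schedule that gradually increases sparsity.
global_params = {"w1": np.random.default_rng(42).normal(size=(64, 32)),
                 "w2": np.random.default_rng(43).normal(size=(32, 10))}
sparsity_schedule = [0.0, 0.2, 0.4, 0.6]

for sparsity in sparsity_schedule:
    # Every client prunes and refines the *same* subset of global parameters.
    mask = magnitude_mask(global_params, sparsity)
    client_models = [local_update(global_params, mask, seed=c) for c in range(5)]
    # FedAvg over the sparse models; pruned positions stay zero, so they are
    # never reactivated and only surviving parameters need to be communicated.
    global_params = {name: sum(m[name] for m in client_models) / len(client_models)
                     for name in global_params}
    kept = sum(int(m.sum()) for m in mask.values())
    print(f"sparsity={sparsity:.1f}, surviving parameters={kept}")
```

In this toy loop, weights pruned in an earlier round remain zero and therefore fall below later magnitude thresholds, which is one simple way to keep the mask monotonically shrinking across rounds.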

Low Difficulty Summary (GrooveSquid.com, original content)
FedMap is a new way to make machine learning models more efficient when working with lots of devices that don’t have much power or storage space. This approach helps keep data private while still training the model. It’s like taking a big picture and making it smaller, but keeping all the important details. The researchers tested this method on different kinds of problems and showed that it works well even when the data isn’t perfectly organized.

Keywords

  * Artificial intelligence
  * Federated learning
  * Machine learning
  * Pruning