Summary of Masked Random Noise for Communication Efficient Federated Learning, by Shiwei Li and Yingyi Cheng and Haozhao Wang and Xing Tang and Shijie Xu and Weihong Luo and Yuhua Li and Dugang Liu and Xiuqiang He and Ruixuan Li
Masked Random Noise for Communication Efficient Federated Learning
by Shiwei Li, Yingyi Cheng, Haozhao Wang, Xing Tang, Shijie Xu, Weihong Luo, Yuhua Li, Dugang Liu, Xiuqiang He, Ruixuan Li
First submitted to arXiv on: 6 Aug 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Distributed, Parallel, and Cluster Computing (cs.DC)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | The paper's original abstract, available on arXiv. |
Medium | GrooveSquid.com (original content) | The proposed Federated Masked Random Noise (FedMRN) framework improves communication efficiency in federated learning by having each client learn a 1-bit mask per model parameter and represent its model update as masked random noise, i.e., the element-wise product of the learned mask and random noise generated from a shared seed. Because only the 1-bit masks (plus the noise seed) need to be transmitted, far less information flows between clients and the server, while convergence guarantees are retained under both strongly convex and non-convex assumptions. Experiments on four popular datasets show that FedMRN outperforms relevant baselines in convergence speed and test accuracy. A minimal sketch of the idea appears after this table. |
Low | GrooveSquid.com (original content) | Federated learning is a way to train models without sharing sensitive data, but it can be slow because clients must send large model updates back and forth. To address this, the researchers propose FedMRN: instead of sending its full update, each client learns a simple on/off mask over shared random noise and sends only that mask to the server. This drastically shrinks what must be transmitted, speeding up training while preserving accuracy. |
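To make the mechanism in the medium-difficulty summary concrete, here is a toy Python sketch under stated assumptions: the function names, the sign-agreement mask rule, and the standard-normal noise are all hypothetical choices for illustration, not the authors' FedMRN implementation. It only shows the communication pattern: the client uploads a 1-bit mask per parameter plus a noise seed, and the server regenerates the noise and applies the mask.

```python
import numpy as np

# Toy sketch of the masked-random-noise idea, NOT the authors' reference
# implementation. Function names, the mask rule, and the noise distribution
# are assumptions made for illustration only.

def client_make_mask(true_update, seed):
    """Client side: choose a 1-bit mask so that masked noise roughly
    approximates the real model update. The paper instead *learns* the mask
    during local training; here we use a crude sign-agreement rule."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(true_update.shape)      # shared random noise
    mask = (noise * true_update > 0).astype(np.uint8)   # 1 bit per parameter
    return mask

def server_reconstruct(mask, shape, seed):
    """Server side: regenerate the same noise from the seed and apply the
    uploaded mask (Hadamard product of mask and noise)."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(shape)
    return mask * noise

if __name__ == "__main__":
    true_update = np.random.randn(5)            # a client's real model update
    seed = 42                                   # shared noise seed
    mask = client_make_mask(true_update, seed)  # only this (plus the seed) is uploaded
    approx = server_reconstruct(mask, true_update.shape, seed)
    print("true  :", np.round(true_update, 2))
    print("approx:", np.round(approx, 2))
```

In the paper itself the mask is learned during local training (the authors describe a progressive stochastic masking strategy), but the transmission pattern is the same: 1-bit masks and a random seed in place of full-precision model updates.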
Keywords
» Artificial intelligence » Federated learning