Towards More Suitable Personalization in Federated Learning via Decentralized Partial Model Training
by Yifan Shi, Yingqi Liu, Yan Sun, Zihao Lin, Li Shen, Xueqian Wang, Dacheng Tao
First submitted to arXiv on: 24 May 2023
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Distributed, Parallel, and Cluster Computing (cs.DC); Optimization and Control (math.OC)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper and are written at different levels of difficulty. The medium- and low-difficulty versions are original summaries written by GrooveSquid.com, while the high-difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary
---|---|---
High | Paper authors | Read the original abstract here
Medium | GrooveSquid.com (original content) | This paper proposes a novel approach to personalized federated learning (PFL) that addresses the data heterogeneity and communication burdens found in real-world applications. The authors introduce DFedAlt, a decentralized framework that trains partially personalized models by alternately updating shared and personal parameters through peer-to-peer communication. To further improve the aggregation of shared parameters, they propose DFedSalt, an optimizer that adds a perturbation along the gradient direction to overcome shared-parameter inconsistency across clients (a minimal sketch of both ideas follows this table). Convergence analysis is provided for both algorithms in the non-convex setting. Experiments on real-world datasets show that decentralized training achieves state-of-the-art accuracy for partial personalization and that the proposed framework outperforms the baselines in model performance.
Low | GrooveSquid.com (original content) | This paper helps make personalized learning fairer and more efficient by letting different devices learn together without sharing all of their data. It proposes a new approach called decentralized partial model training, which lets each device update its own model while also helping to improve the other devices’ models. The authors tested this method on real-world datasets and found that it outperformed previous methods on certain types of learning tasks. This could matter for applications like personalized health recommendations or customized education.
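To make the alternating scheme concrete, here is a minimal sketch of one client's round in a DFedAlt-style setup, plus a SAM-like perturbed gradient in the spirit of DFedSalt. This is illustrative only, not the authors' implementation: the NumPy parameter dictionaries, plain SGD, uniform gossip weights, and the names `grad_fn`, `sgd_step`, `dfedalt_round`, `sam_shared_grad`, and `rho` are all assumptions made for the example.

```python
import numpy as np

def sgd_step(params, grads, lr):
    """One plain SGD update on a dict of parameter arrays."""
    return {k: params[k] - lr * grads[k] for k in params}

def dfedalt_round(shared, personal, data, grad_fn, neighbors, lr=0.1):
    """One communication round for a single client (hypothetical API).

    shared:    dict of shared-parameter arrays (exchanged with peers)
    personal:  dict of personal-parameter arrays (never communicated)
    grad_fn:   returns (grad_shared, grad_personal) on the local data
    neighbors: list of the neighbors' shared-parameter dicts
    """
    # 1) Alternating local updates: personal parameters first, then shared.
    _, g_personal = grad_fn(shared, personal, data)
    personal = sgd_step(personal, g_personal, lr)
    g_shared, _ = grad_fn(shared, personal, data)
    shared = sgd_step(shared, g_shared, lr)

    # 2) Peer-to-peer aggregation: average shared parameters with neighbors.
    #    Uniform weights here; a general gossip/mixing matrix also works.
    models = [shared] + list(neighbors)
    shared = {k: sum(m[k] for m in models) / len(models) for k in shared}
    return shared, personal

def sam_shared_grad(shared, personal, data, grad_fn, rho=0.05):
    """DFedSalt-flavored step: perturb the shared parameters along the
    (normalized) gradient direction, then re-evaluate the gradient at
    the perturbed point, as in sharpness-aware minimization."""
    g, _ = grad_fn(shared, personal, data)
    norm = np.sqrt(sum(np.sum(v ** 2) for v in g.values())) + 1e-12
    perturbed = {k: shared[k] + rho * g[k] / norm for k in shared}
    g_sam, _ = grad_fn(perturbed, personal, data)
    return g_sam
```

In a full round, every client would run `dfedalt_round` in parallel and exchange only its shared dictionary with its neighbors, keeping the personal parameters local; substituting `sam_shared_grad` for the plain shared gradient gives the DFedSalt-style variant.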
Keywords
* Artificial intelligence
* Federated learning