Summary of Decoupling General and Personalized Knowledge in Federated Learning via Additive and Low-Rank Decomposition, by Xinghao Wu et al.
Decoupling General and Personalized Knowledge in Federated Learning via Additive and Low-Rank Decomposition
by Xinghao Wu, Xuefeng Liu, Jianwei Niu, Haolin Wang, Shaojie Tang, Guogang Zhu, Hao Su
First submitted to arXiv on: 28 Jun 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract here |
Medium | GrooveSquid.com (original content) | A novel Personalized Federated Learning (PFL) paradigm, called FedDecomp, is introduced to address data heterogeneity in collaborative learning. Unlike existing PFL methods that adopt parameter-partitioning approaches, FedDecomp employs additive decomposition to decouple shared and client-specific knowledge: each model parameter is decomposed into a shared component and a personalized one, allowing a cleaner separation of the two types of knowledge. In addition, FedDecomp uses low-rank matrix factorization to reduce the model capacity required for retaining local knowledge, and adopts an alternating training strategy to further improve performance (see the illustrative sketch after this table). Experimental results demonstrate that FedDecomp outperforms state-of-the-art methods by up to 4.9% across multiple datasets with varying degrees of data heterogeneity. |
Low | GrooveSquid.com (original content) | FedDecomp is a new way to help different devices learn together even if they have different types of data. Right now, when devices share what they’ve learned, some information might not be useful for everyone else. To fix this, FedDecomp breaks down each piece of information into two parts: something that’s the same for all devices and something that’s unique to each device. This makes it easier for devices to learn from each other without getting confused by extra information. |
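The following is a minimal, hypothetical PyTorch-style sketch of the idea described in the medium-difficulty summary: each weight matrix is the sum of a shared component and a personalized low-rank component, and the two are trained alternately on each client. The class `DecomposedLinear`, the function `local_round`, and choices such as `rank=4` or one epoch per phase are our assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class DecomposedLinear(nn.Module):
    """Linear layer whose weight is the sum of a shared full-rank matrix and a
    personalized low-rank product A @ B (illustrative sketch, not the paper's code)."""
    def __init__(self, in_features, out_features, rank=4):
        super().__init__()
        # Shared component: the part a server would aggregate across clients.
        self.shared = nn.Parameter(torch.empty(out_features, in_features))
        nn.init.kaiming_uniform_(self.shared)
        # Personalized component, stored in low-rank form and kept on the client.
        self.A = nn.Parameter(torch.zeros(out_features, rank))
        self.B = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        weight = self.shared + self.A @ self.B  # additive decomposition
        return nn.functional.linear(x, weight, self.bias)


def local_round(model, loader, loss_fn, lr=0.01, personal_epochs=1, shared_epochs=1):
    """One client round with alternating training: update the personalized
    low-rank part first (shared frozen), then the shared part (personalized frozen)."""
    personal_names = {n for n, _ in model.named_parameters()
                      if n.split(".")[-1] in ("A", "B")}
    personal = [p for n, p in model.named_parameters() if n in personal_names]
    shared = [p for n, p in model.named_parameters() if n not in personal_names]

    def train_phase(active, frozen, epochs):
        for p in frozen:
            p.requires_grad_(False)
        for p in active:
            p.requires_grad_(True)
        opt = torch.optim.SGD(active, lr=lr)
        for _ in range(epochs):
            for x, y in loader:
                opt.zero_grad()
                loss_fn(model(x), y).backward()
                opt.step()

    train_phase(personal, shared, personal_epochs)  # fit local, personalized knowledge first
    train_phase(shared, personal, shared_epochs)    # then refine the shared knowledge

    # Only the shared tensors would be uploaded for server-side averaging (e.g. FedAvg-style).
    return {n: p.detach().clone() for n, p in model.named_parameters()
            if n not in personal_names}
```

In a full federated loop, a server would average only the returned shared tensors across clients, while each client keeps its low-rank `A` and `B` matrices local; the low rank caps how much capacity is spent on client-specific knowledge, which is the decoupling the paper's title refers to.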
Keywords
» Artificial intelligence » Federated learning