Overcoming Data and Model Heterogeneities in Decentralized Federated Learning via Synthetic Anchors
by Chun-Yin Huang, Kartik Srinivas, Xin Zhang, Xiaoxiao Li
First submitted to arXiv on: 19 May 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract here |
Medium | GrooveSquid.com (original content) | The proposed decentralized Federated Learning (FL) technique, dubbed DeSA, addresses the challenge of data and model heterogeneity among clients in decentralized FL. By introducing synthetic anchors built with domain adaptation and Knowledge Distillation (KD), DeSA facilitates mutual knowledge transfer and enhances both inter- and intra-domain accuracy for each client. The technique includes two effective regularization terms: a REG loss that aligns the distribution of each client’s latent embeddings with the anchors, and a KD loss that enables clients to learn from one another (see the sketch after the table). Experimental results on diverse client data distributions demonstrate the effectiveness of DeSA in enhancing model generalizability. |
Low | GrooveSquid.com (original content) | Decentralized Federated Learning (FL) is a way for devices to work together while keeping their data private. Usually, each device trains its own local model, which makes it hard to share knowledge between devices. The new technique, called DeSA, helps solve this problem by creating “anchors”: shared synthetic data that stand in for a global view of everyone’s data. These anchors let devices learn from each other and improve their models. DeSA also includes two ways to keep devices’ models on track: one makes sure their embeddings (a way of representing data) stay close to the anchors, and the other lets them learn from other devices’ predictions. The results show that DeSA helps each device’s model perform well both on its own data and on data from other devices. |
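
To make the two regularization terms concrete, here is a minimal PyTorch-style sketch of one client’s training objective. This is an illustrative assumption, not the paper’s exact formulation: the function name `desa_local_loss`, the moment-matching form of the REG loss, the loss weights `lambda_reg`/`lambda_kd`, and the convention that the model returns `(embedding, logits)` are all hypothetical.

```python
import torch
import torch.nn.functional as F

def desa_local_loss(model, x, y, anchors_x, anchor_logits,
                    lambda_reg=0.1, lambda_kd=1.0):
    """One client's objective: task loss + REG loss + KD loss.

    Hypothetical sketch. Assumptions:
    - model(x) returns (latent_embedding, logits).
    - anchors_x: synthetic anchor inputs shared by all clients.
    - anchor_logits: aggregated logits from other clients' models
      on the anchors (the knowledge-distillation targets).
    """
    emb, logits = model(x)
    task_loss = F.cross_entropy(logits, y)

    # REG loss: pull the client's latent embeddings toward the
    # anchor embedding distribution (here, a simple mean match;
    # the paper's distribution-matching term may differ).
    anchor_emb, anchor_out = model(anchors_x)
    reg_loss = (emb.mean(0) - anchor_emb.mean(0)).pow(2).sum()

    # KD loss: match this client's predictions on the anchors to
    # the other clients' soft predictions, so knowledge transfers
    # without sharing raw data.
    kd_loss = F.kl_div(F.log_softmax(anchor_out, dim=1),
                       F.softmax(anchor_logits, dim=1),
                       reduction="batchmean")

    return task_loss + lambda_reg * reg_loss + lambda_kd * kd_loss
```

Because the anchors are synthetic and shared, each client can compute both terms locally; only model outputs on the anchors need to be exchanged between peers, which is what makes the scheme workable in a decentralized (server-free) setting.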
Keywords
» Artificial intelligence » Domain adaptation » Embedding » Federated learning » Knowledge distillation » Regularization