Summary of HeteroSwitch: Characterizing and Taming System-Induced Data Heterogeneity in Federated Learning, by Gyudong Kim et al.
HeteroSwitch: Characterizing and Taming System-Induced Data Heterogeneity in Federated Learning
by Gyudong Kim, Mehdi Ghasemi, Soroush Heidari, Seungryong Kim, Young Geun Kim, Sarma Vrudhula, Carole-Jean Wu
First submitted to arXiv on: 7 Mar 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Distributed, Parallel, and Cluster Computing (cs.DC)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here. |
| Medium | GrooveSquid.com (original content) | Federated Learning (FL) trains deep learning models collaboratively across user-end devices, protecting user privacy by keeping raw data on-device. The paper investigates how system-induced data heterogeneity affects FL model performance. The authors collect a dataset from heterogeneous devices spanning different vendors and performance tiers, and show that this form of heterogeneity degrades accuracy and worsens fairness and domain generalization problems. To address these challenges, they propose HeteroSwitch, an adaptive algorithm that applies generalization techniques selectively, depending on the level of bias introduced by varying hardware and software configurations (a minimal sketch of this adaptive switching follows the table). |
| Low | GrooveSquid.com (original content) | Federated Learning helps train AI models without sharing raw data. This paper looks at how devices with different hardware and software can affect model performance. It finds that these differences make the model less accurate, less fair, and worse at handling new situations. To fix these problems, the researchers propose a new training method called HeteroSwitch, which adapts to the differences between devices. |
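To make the adaptive switching concrete, below is a minimal, hypothetical Python sketch: a client compares statistics of its locally captured data against a server-provided global reference and enables extra generalization steps only when the drift is large. The function names, the bias proxy, and the threshold are illustrative assumptions and not the paper's actual implementation.

```python
# Hypothetical sketch of per-client adaptive switching (not the authors' code):
# the client estimates how much its hardware/software stack (e.g., camera ISP)
# biases its local data and enables generalization techniques only when needed.
import numpy as np


def bias_score(local_mean: np.ndarray, global_mean: np.ndarray) -> float:
    """Proxy for system-induced bias: distance between per-channel image
    statistics of the local data and the global reference from the server."""
    return float(np.linalg.norm(local_mean - global_mean))


def select_generalization(local_mean, global_mean, threshold=0.1):
    """Decide which (placeholder) generalization techniques to enable
    on this client for the next local training round."""
    score = bias_score(np.asarray(local_mean, dtype=float),
                       np.asarray(global_mean, dtype=float))
    techniques = []
    if score > threshold:
        # Strongly biased client: normalize inputs and use a
        # generalization-friendly weight-averaging scheme during training.
        techniques = ["input_transformation", "weight_averaging"]
    return score, techniques


# Example: a client whose camera pipeline shifts images toward red.
score, active = select_generalization(
    local_mean=[0.62, 0.48, 0.45],   # per-channel RGB means of local images
    global_mean=[0.50, 0.50, 0.50],  # global reference distributed by server
)
print(f"bias score = {score:.3f}, enabled techniques = {active}")
```

In this sketch, only clients whose bias score exceeds the threshold pay the extra cost of the mitigation steps, while clients with near-reference statistics train as usual, mirroring the summary's description of applying generalization techniques depending on the level of bias.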
Keywords
* Artificial intelligence * Deep learning * Domain generalization * Federated learning * Generalization