Exploring Vacant Classes in Label-Skewed Federated Learning
by Kuangpu Guo, Yuhe Ding, Jian Liang, Ran He, Zilei Wang, Tieniu Tan
First submitted to arXiv on: 4 Jan 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Computer Vision and Pattern Recognition (cs.CV)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper, each written at a different level of difficulty. The medium- and low-difficulty versions are original summaries written by GrooveSquid.com, while the high-difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here |
| Medium | GrooveSquid.com (original content) | The proposed FedVLS approach addresses label skew in federated learning by integrating vacant-class distillation and logit suppression. During local training, each client uses knowledge distillation to retain essential information about classes absent from its data, drawing on the global model, while logit suppression directly penalizes non-label class predictions to reduce misclassifications toward majority classes. The method outperforms previous state-of-the-art approaches across diverse datasets with varying degrees of label skew. (A hedged code sketch follows this table.) |
| Low | GrooveSquid.com (original content) | FedVLS is a new way to help computers learn together without sharing all their data. When different devices have different types of data, it can be hard for them to agree on what’s what. FedVLS helps by making sure the devices remember important details about things they don’t have much information about. It also stops the devices from guessing incorrectly based on what they do have information about. This makes the whole process work better and more accurately. |
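To make the two loss terms in the medium summary concrete, here is a minimal PyTorch sketch of a local training objective combining cross-entropy, vacant-class distillation, and a suppression penalty on non-label predictions. This is an illustration of the idea, not the authors' implementation: the function name `fedvls_local_loss`, the weights `lambda_distill` and `lambda_suppress`, and the squared-probability form of the suppression term are all assumptions; the paper's exact formulation may differ.

```python
import torch
import torch.nn.functional as F

def fedvls_local_loss(logits, global_logits, labels, present_classes,
                      lambda_distill=1.0, lambda_suppress=1.0):
    """Illustrative local objective: cross-entropy + vacant-class
    distillation + a logit-suppression penalty (hypothetical form)."""
    num_classes = logits.size(1)

    # Standard supervised loss on the client's own (label-skewed) data.
    ce = F.cross_entropy(logits, labels)

    # Vacant-class distillation: on classes absent from this client's
    # data, pull local predictions toward the global model's predictions
    # (global_logits should come from the frozen global model, no_grad).
    vacant = torch.ones(num_classes, dtype=torch.bool, device=logits.device)
    vacant[present_classes] = False
    if vacant.any():
        teacher = F.softmax(global_logits[:, vacant], dim=1)
        student = F.log_softmax(logits[:, vacant], dim=1)
        distill = F.kl_div(student, teacher, reduction="batchmean")
    else:
        distill = logits.new_zeros(())

    # Logit suppression: penalize probability mass on non-label classes.
    # A squared-probability penalty is used here purely for illustration.
    probs = F.softmax(logits, dim=1)
    label_mask = F.one_hot(labels, num_classes).bool()
    suppress = probs.masked_fill(label_mask, 0.0).pow(2).sum(dim=1).mean()

    return ce + lambda_distill * distill + lambda_suppress * suppress
```

A quick usage example, for a client that only observes classes 0-2 out of 10:

```python
logits = torch.randn(8, 10)            # local model outputs
global_logits = torch.randn(8, 10)     # frozen global model outputs
labels = torch.randint(0, 3, (8,))     # local labels, skewed to classes 0-2
present_classes = torch.tensor([0, 1, 2])
loss = fedvls_local_loss(logits, global_logits, labels, present_classes)
```

Restricting the distillation term to vacant classes is the key design choice: it preserves the global model's knowledge exactly where the client has no supervision, without fighting the cross-entropy loss on the classes the client does observe.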
Keywords
- Artificial intelligence
- Distillation
- Federated learning
- Knowledge distillation