Summary of Text-Enhanced Data-free Approach for Federated Class-Incremental Learning, by Minh-Tuan Tran et al.
Text-Enhanced Data-free Approach for Federated Class-Incremental Learning
by Minh-Tuan Tran, Trung Le, Xuan-May Le, Mehrtash Harandi, Dinh Phung
First submitted to arXiv on: 21 Mar 2024
Categories
- Main: Computer Vision and Pattern Recognition (cs.CV)
- Secondary: Computation and Language (cs.CL); Machine Learning (cs.LG)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here |
| Medium | GrooveSquid.com (original content) | In this paper, the authors tackle Federated Class-Incremental Learning (FCIL), a challenging setting in federated learning where new classes are added dynamically over time. They introduce LANDER, an approach that uses label text embeddings (LTE) produced by pretrained language models to address catastrophic forgetting while preserving data privacy. During model training, LTE anchors constrain the feature embeddings of corresponding samples to stay near them, enriching the surrounding region with meaningful information. In the data-free knowledge transfer (DFKT) phase, LANDER synthesizes more meaningful samples around these LTE anchors, effectively mitigating the forgetting problem. The authors also introduce a Bounding Loss that lets sample embeddings vary freely within a defined radius, preserving natural within-class differences and reducing embedding overlap in heterogeneous federated settings. Extensive experiments on CIFAR100, Tiny-ImageNet, and ImageNet show that LANDER outperforms previous methods and achieves state-of-the-art performance in FCIL. |
| Low | GrooveSquid.com (original content) | In this paper, researchers developed a new way to learn from data without sharing it with others. This is important because it helps protect people's privacy. They created an approach called LANDER that uses special text embeddings to help the learning process. LANDER does two things: first, it keeps track of old classes and makes sure the model doesn't forget them; second, it creates new samples that are similar to the ones the model has seen before. This helps the model learn quickly and accurately from small amounts of data. The authors tested LANDER on several datasets and found that it worked better than other methods. |
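The Bounding Loss described in the medium-difficulty summary can be illustrated with a minimal sketch. The hinge-on-Euclidean-distance form, the fixed `radius` parameter, and the function name below are illustrative assumptions, not the paper's exact formulation:

```python
import math

def bounding_loss(feature, anchor, radius=1.0):
    """Hypothetical sketch: penalize a feature embedding only when it lies
    farther than `radius` from its class's label-text-embedding (LTE) anchor.
    Inside the radius there is no penalty, so embeddings keep their natural
    within-class differences rather than collapsing onto the anchor."""
    # Euclidean distance between the feature embedding and its LTE anchor
    dist = math.sqrt(sum((f - a) ** 2 for f, a in zip(feature, anchor)))
    # Hinge: zero loss inside the ball, linear growth beyond its boundary
    return max(0.0, dist - radius)

# Inside the radius: no penalty, the embedding is left flexible.
print(bounding_loss([0.5, 0.0], [0.0, 0.0], radius=1.0))  # → 0.0
# Outside the radius: loss grows with the distance past the boundary.
print(bounding_loss([3.0, 0.0], [0.0, 0.0], radius=1.0))  # → 2.0
```

The key design point the summary highlights is the flat region inside the radius: unlike a plain pull-to-anchor loss, this keeps distinct samples of one class from being forced onto the same point, which helps when client data distributions are heterogeneous.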
Keywords
* Artificial intelligence * Embedding * Federated learning