Summary of SLIM: Let LLM Learn More and Forget Less with Soft LoRA and Identity Mixture, by Jiayi Han et al.
SLIM: Let LLM Learn More and Forget Less with Soft LoRA and Identity Mixture
by Jiayi Han, Liang Du, Hongwei Du, Xiangguo Zhou, Yiwen Wu, Weibo Zheng, Donghong Han
First submitted to arXiv on: 10 Oct 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Computation and Language (cs.CL)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | The paper's original abstract, available on arXiv. |
| Medium | GrooveSquid.com (original content) | The proposed Soft LoRA and Identity Mixture (SLIM) framework effectively balances training budget, downstream performance, and the general capabilities of large language models (LLMs). This mixture-of-experts (MoE) approach dynamically routes between LoRA adapters and skip (identity) connections to mitigate catastrophic forgetting. By combining weight yielding with sliding clustering for out-of-domain distinguishability, SLIM achieves state-of-the-art results in mitigating forgetting while maintaining comparable performance on downstream tasks. A rough illustrative sketch of the routing idea follows the table. |
| Low | GrooveSquid.com (original content) | The paper proposes a way to make large language models (LLMs) work better on specific tasks without losing their general abilities. The method mixes and routes between different parts of the model, allowing it to learn from new data without forgetting what it already knows. This is important because LLMs are very good at understanding language in general, but they often struggle to apply that understanding to specific tasks. |
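To make the routing idea concrete, below is a minimal, hypothetical sketch in PyTorch of a layer that mixes LoRA adapters with an identity (skip) expert using soft routing weights. This is not the authors' implementation: the class name `SoftLoRAIdentityMixture`, the rank, and the number of experts are assumptions chosen purely for illustration.

```python
# Illustrative sketch only (not the paper's code): a frozen linear layer whose
# output is adjusted by a soft mixture of LoRA experts and one identity expert.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftLoRAIdentityMixture(nn.Module):
    """Wraps a frozen linear layer with a routed mixture of LoRA adapters
    and an identity (no-update) expert."""

    def __init__(self, base_linear: nn.Linear, rank: int = 8, num_lora_experts: int = 2):
        super().__init__()
        self.base = base_linear
        for p in self.base.parameters():
            p.requires_grad = False  # keep the pretrained weights frozen

        d_in, d_out = base_linear.in_features, base_linear.out_features
        # One low-rank (A, B) pair per LoRA expert; B starts at zero so the
        # adapted layer initially matches the pretrained layer exactly.
        self.lora_A = nn.ParameterList(
            [nn.Parameter(torch.randn(d_in, rank) * 0.01) for _ in range(num_lora_experts)]
        )
        self.lora_B = nn.ParameterList(
            [nn.Parameter(torch.zeros(rank, d_out)) for _ in range(num_lora_experts)]
        )
        # The router scores the LoRA experts plus one identity expert.
        self.router = nn.Linear(d_in, num_lora_experts + 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        base_out = self.base(x)                    # frozen pretrained path
        gate = F.softmax(self.router(x), dim=-1)   # soft routing weights per token

        delta = torch.zeros_like(base_out)
        for i, (A, B) in enumerate(zip(self.lora_A, self.lora_B)):
            # Weight each LoRA expert's low-rank update by its routing weight.
            delta = delta + gate[..., i:i + 1] * (x @ A @ B)
        # The last routing weight belongs to the identity expert, which adds
        # nothing, so the layer can fall back to its pretrained behaviour.
        return base_out + delta


# Example usage with hypothetical sizes (batch, sequence, hidden).
layer = SoftLoRAIdentityMixture(nn.Linear(768, 768))
out = layer(torch.randn(4, 16, 768))
```

Because the softmax routing weights sum to one, putting more mass on the identity expert smoothly dials the layer back toward its frozen pretrained behaviour, which is the mechanism the summary credits with reducing catastrophic forgetting.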
Keywords
» Artificial intelligence » Clustering » LoRA