Summary of Rethinking LLM Language Adaptation: A Case Study on Chinese Mixtral, by Yiming Cui et al.
Rethinking LLM Language Adaptation: A Case Study on Chinese Mixtral, by Yiming Cui, Xin Yao. First submitted…