Summary of Uni-MoE: Scaling Unified Multimodal LLMs with Mixture of Experts, by Yunxin Li et al.
Uni-MoE: Scaling Unified Multimodal LLMs with Mixture of Experts by Yunxin Li, Shenyuan Jiang, Baotian Hu,…
Many Hands Make Light Work: Task-Oriented Dialogue System with Module-Based Mixture-of-Experts by Ruolin Su, Biing-Hwang Juang. First…
SUTRA: Scalable Multilingual Language Model Architecture by Abhijit Bendale, Michael Sapienza, Steven Ripplinger, Simon Gibbs, Jaewon…
A Mixture-of-Experts Approach to Few-Shot Task Transfer in Open-Ended Text Worlds by Christopher Z. Cui, Xiangyu…
Mix of Experts Language Model for Named Entity Recognition by Xinwei Chen, Kun Li, Tianyou Song,…
MMoE: Robust Spoiler Detection with Multi-modal Information and Domain-aware Mixture-of-Experts by Zinan Zeng, Sen Ye, Zijian…
ConstitutionalExperts: Training a Mixture of Principle-based Prompts by Savvas Petridis, Ben Wedin, Ann Yuan, James Wexler,…
Hypertext Entity Extraction in Webpage by Yifei Yang, Tianqiao Liu, Bo Shao, Hai Zhao, Linjun Shou,…
Rethinking LLM Language Adaptation: A Case Study on Chinese Mixtral by Yiming Cui, Xin Yao. First submitted…
An Effective Mixture-of-Experts Approach for Code-Switching Speech Recognition Leveraging Encoder Disentanglement by Tzu-Ting Yang, Hsin-Wei Wang,…