Summary of LLMBind: A Unified Modality-Task Integration Framework, by Bin Zhu et al.
LLMBind: A Unified Modality-Task Integration Framework by Bin Zhu, Munan Ning, Peng Jin, Bin Lin, Jinfa…
MoRAL: MoE Augmented LoRA for LLMs’ Lifelong Learning by Shu Yang, Muhammad Asif Ali, Cheng-Long Wang,…
Higher Layers Need More LoRA Experts by Chongyang Gao, Kezhen Chen, Jinmeng Rao, Baochen Sun, Ruibo…
Parameter-Efficient Sparsity Crafting from Dense to Mixture-of-Experts for Instruction Tuning on General Tasks by Haoyuan Wu,…
Multimodal Variational Autoencoder: a Barycentric View by Peijie Qiu, Wenhui Zhu, Sayantan Kumar, Xiwen Chen, Xiaotong…
Graph Mixture of Experts and Memory-augmented Routers for Multivariate Time Series Anomaly Detection by Xiaoyu Huang,…
Theory of Mixture-of-Experts for Mobile Edge Computing by Hongbo Li, Lingjie Duan. First submitted to arxiv on:…
ReMoE: Fully Differentiable Mixture-of-Experts with ReLU Routing by Ziteng Wang, Jun Zhu, Jianfei Chen. First submitted to…
A Survey on Inference Optimization Techniques for Mixture of Experts Models by Jiacheng Liu, Peng Tang,…
Wonderful Matrices: Combining for a More Efficient and Effective Foundation Model Architecture by Jingze Shi, Bingheng…