Multi-Treatment Multi-Task Uplift Modeling for Enhancing User Growth by Yuxiang Wei, Zhaoxin Qiu, Yingjie Li, Yuke…
FactorLLM: Factorizing Knowledge via Mixture of Experts for Large Language Models by Zhongyu Zhao, Menghang Dong,…
FedMoE: Personalized Federated Learning via Heterogeneous Mixture of Experts by Hanzi Mei, Dongqi Cai, Ao Zhou,…
HMoE: Heterogeneous Mixture of Experts for Language Modeling by An Wang, Xingwu Sun, Ruobing Xie, Shuaipeng…
AnyGraph: Graph Foundation Model in the Wild by Lianghao Xia, Chao Huang. First submitted to arXiv on:…
Navigating Spatio-Temporal Heterogeneity: A Graph Transformer Approach for Traffic Forecasting by Jianxiang Zhou, Erdong Liu, Wei…
SMILE: Zero-Shot Sparse Mixture of Low-Rank Experts Construction From Pre-Trained Foundation Models by Anke Tang, Li…
AdapMoE: Adaptive Sensitivity-based Expert Gating and Management for Efficient MoE Inference by Shuzhang Zhong, Ling Liang,…
BAM! Just Like That: Simple and Efficient Parameter Upcycling for Mixture of Experts by Qizhen Zhang,…
A Survey on Model MoErging: Recycling and Routing Among Specialized Experts for Collaborative Learning by Prateek…