Summary of MoPD: Mixture-of-Prompts Distillation for Vision-Language Models, by Yang Chen et al.
MoPD: Mixture-of-Prompts Distillation for Vision-Language Models, by Yang Chen, Shuai Fu, Yu Zhang. First submitted to arXiv…