Summary of Group Distributionally Robust Dataset Distillation with Risk Minimization, by Saeed Vahidian et al.
Group Distributionally Robust Dataset Distillation with Risk Minimization, by Saeed Vahidian, Mingyu Wang, Jianyang Gu, Vyacheslav…
Amortized Planning with Large-Scale Transformers: A Case Study on Chess, by Anian Ruoss, Grégoire Delétang, Sourabh…
How Good is a Single Basin?, by Kai Lion, Lorenzo Noci, Thomas Hofmann, Gregor Bachmann. First submitted…
DFML: Decentralized Federated Mutual Learning, by Yasser H. Khalil, Amir H. Estiri, Mahdi Beitollahi, Nader Asadi,…
Efficient Prompt Caching via Embedding Similarity, by Hanlin Zhu, Banghua Zhu, Jiantao Jiao. First submitted to arXiv…
MoDE: A Mixture-of-Experts Model with Mutual Distillation among the Experts, by Zhitian Xie, Yinger Zhang, Chenyi…
EPSD: Early Pruning with Self-Distillation for Efficient Model Compression, by Dong Chen, Ning Liu, Yichen Zhu,…
Spectral Co-Distillation for Personalized Federated Learning, by Zihan Chen, Howard H. Yang, Tony Q.S. Quek, Kai…
Importance-Aware Adaptive Dataset Distillation, by Guang Li, Ren Togo, Takahiro Ogawa, Miki Haseyama. First submitted to arXiv…
Exploration and Anti-Exploration with Distributional Random Network Distillation, by Kai Yang, Jian Tao, Jiafei Lyu, Xiu…