Summary of Distilling the Knowledge in Data Pruning, by Emanuel Ben-Baruch et al.
Distilling the Knowledge in Data Pruning, by Emanuel Ben-Baruch, Adam Botach, Igor Kviatkovsky, Manoj Aggarwal, Gérard…