Summary of Learning from Committee: Reasoning Distillation from a Mixture of Teachers with Peer-Review, by Zhuochun Li et al.
Learning from Committee: Reasoning Distillation from a Mixture of Teachers with Peer-Review by Zhuochun Li, Yuelyu…
AMR-Evol: Adaptive Modular Response Evolution Elicits Better Knowledge Distillation for Large Language Models in Code…
Efficient Technical Term Translation: A Knowledge Distillation Approach for Parenthetical Terminology Translation by Jiyoon Myung, Jihyeon…
DSG-KD: Knowledge Distillation from Domain-Specific to General Language Models by Sangyeon Cho, Jangyeong Jeon, Dongjoon Lee,…
DilateQuant: Accurate and Efficient Diffusion Quantization via Weight Dilation by Xuewen Liu, Zhikai Li, Qingyi Gu. First…
Simple Unsupervised Knowledge Distillation With Space Similarity by Aditya Singh, Haohan Wang. First submitted to arxiv on:…
LLMR: Knowledge Distillation with a Large Language Model-Induced Reward by Dongheng Li, Yongchang Hao, Lili Mou. First…
EFCM: Efficient Fine-tuning on Compressed Models for deployment of large models in medical image analysis by…
LEROjD: Lidar Extended Radar-Only Object Detection by Patrick Palmer, Martin Krüger, Stefan Schütte, Richard Altendorfer, Ganesh…
Improving Apple Object Detection with Occlusion-Enhanced Distillation by Liang Geng. First submitted to arxiv on: 3 Sep…