Summary of Learning from Committee: Reasoning Distillation from a Mixture of Teachers with Peer-Review, by Zhuochun Li et al.
Learning from Committee: Reasoning Distillation from a Mixture of Teachers with Peer-Review, by Zhuochun Li, Yuelyu…
Efficient Technical Term Translation: A Knowledge Distillation Approach for Parenthetical Terminology Translation, by Jiyoon Myung, Jihyeon…
AMR-Evol: Adaptive Modular Response Evolution Elicits Better Knowledge Distillation for Large Language Models in Code…
DSG-KD: Knowledge Distillation from Domain-Specific to General Language Models, by Sangyeon Cho, Jangyeong Jeon, Dongjoon Lee,…
Distilling Privileged Multimodal Information for Expression Recognition using Optimal Transport, by Muhammad Haseeb Aslam, Muhammad Osama…