Summary of Small Models are LLM Knowledge Triggers on Medical Tabular Prediction, by Jiahuan Yan et al.
Small Models are LLM Knowledge Triggers on Medical Tabular Prediction, by Jiahuan Yan, Jintai Chen, Chaowen…
Differentially Private Knowledge Distillation via Synthetic Text Generation, by James Flemings, Murali Annavaram. First submitted to arxiv…
Teacher-Student Learning on Complexity in Intelligent Routing, by Shu-Ting Pi, Michael Yang, Yuying Zhu, Qun Liu. First…
Co-Supervised Learning: Improving Weak-to-Strong Generalization with Hierarchical Mixture of Experts, by Yuejiang Liu, Alexandre Alahi. First submitted…
Wisdom of Committee: Distilling from Foundation Model to Specialized Application Model, by Zichang Liu, Qingyun Liu,…
PaCKD: Pattern-Clustered Knowledge Distillation for Compressing Memory Access Prediction Models, by Neelesh Gupta, Pengmiao Zhang, Rajgopal…
Teacher as a Lenient Expert: Teacher-Agnostic Data-Free Knowledge Distillation, by Hyunjune Shin, Dong-Wan Choi. First submitted to…
Distilling Large Language Models for Text-Attributed Graph Learning, by Bo Pan, Zheng Zhang, Yifei Zhang, Yuntong…
GraphKD: Exploring Knowledge Distillation Towards Document Object Detection with Structured Graph Creation, by Ayan Banerjee, Sanket…
Knowledge Distillation Based on Transformed Teacher Matching, by Kaixiang Zheng, En-Hui Yang. First submitted to arxiv on:…