Summary of LoCa: Logit Calibration for Knowledge Distillation, by Runming Yang et al.
LoCa: Logit Calibration for Knowledge Distillation, by Runming Yang, Taiqiang Wu, Yujiu Yang. First submitted to arXiv…
DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture, by Qianlong Xiang, Miao Zhang, Yuzhang…
Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation, by Ruixin Shi, Weijia Guo, Shiming Ge. First submitted to…
Compressing VAE-Based Out-of-Distribution Detectors for Embedded Deployment, by Aditya Bansal, Michael Yuhas, Arvind Easwaran. First submitted to…
TSAK: Two-Stage Semantic-Aware Knowledge Distillation for Efficient Wearable Modality and Model Optimization in Manufacturing Lines, by…
Condensed Sample-Guided Model Inversion for Knowledge Distillation, by Kuluhan Binici, Shivam Aggarwal, Cihan Acar, Nam Trung…
Foundational Model for Electron Micrograph Analysis: Instruction-Tuning Small-Scale Language-and-Vision Assistant for Enterprise Adoption, by Sakhinana Sagar…
Aligning (Medical) LLMs for (Counterfactual) Fairness, by Raphael Poulain, Hamed Fayyaz, Rahmatollah Beheshti. First submitted to arXiv…
LAKD-Activation Mapping Distillation Based on Local Learning, by Yaoze Zhang, Yuming Zhang, Yu Zhao, Yue Zhang,…
A Unified Framework for Continual Learning and Unlearning, by Romit Chatterjee, Vikram Chundawat, Ayush Tarun, Ankur…