Summary of Self-Supervised Keypoint Detection with Distilled Depth Keypoint Representation, by Aman Anand et al.
Self-Supervised Keypoint Detection with Distilled Depth Keypoint Representation by Aman Anand, Elyas Rashno, Amir Eskandari, Farhana…
Unlearning Backdoor Attacks for LLMs with Weak-to-Strong Knowledge Distillation by Shuai Zhao, Xiaobao Wu, Cong-Duy Nguyen,…
TAS: Distilling Arbitrary Teacher and Student via a Hybrid Assistant by Guopeng Li, Qiang Wang, Ke…
Speculative Knowledge Distillation: Bridging the Teacher-Student Gap Through Interleaved Sampling by Wenda Xu, Rujun Han, Zifeng…
Declarative Knowledge Distillation from Large Language Models for Visual Question Answering Datasets by Thomas Eiter, Jan…
Mentor-KD: Making Small Language Models Better Multi-step Reasoners by Hojae Lee, Junho Kim, SangKeun Lee. First submitted…
SNN-PAR: Energy Efficient Pedestrian Attribute Recognition via Spiking Neural Networks by Haiyang Wang, Qian Zhu, Mowen…
CAPEEN: Image Captioning with Early Exits and Knowledge Distillation by Divya Jyoti Bajpai, Manjesh Kumar Hanawal. First…
DAdEE: Unsupervised Domain Adaptation in Early Exit PLMs by Divya Jyoti Bajpai, Manjesh Kumar Hanawal. First submitted…
Accelerating Diffusion Models with One-to-Many Knowledge Distillation by Linfeng Zhang, Kaisheng Ma. First submitted to arXiv on:…