Summary of A Survey on Symbolic Knowledge Distillation of Large Language Models, by Kamal Acharya et al.
A Survey on Symbolic Knowledge Distillation of Large Language Models by Kamal Acharya, Alvaro Velasquez, Houbing…
FedQUIT: On-Device Federated Unlearning via a Quasi-Competent Virtual Teacher by Alessio Mora, Lorenzo Valerio, Paolo Bellavista,…
DisCoM-KD: Cross-Modal Knowledge Distillation via Disentanglement Representation and Adversarial Learning by Dino Ienco, Cassio Fraga Dantas. First…
Low-Dimensional Federated Knowledge Graph Embedding via Knowledge Distillation by Xiaoxiong Zhang, Zhiwei Zeng, Xin Zhou, Zhiqi…