Summary of LLM-RadJudge: Achieving Radiologist-Level Evaluation for X-Ray Report Generation, by Zilong Wang et al.
LLM-RadJudge: Achieving Radiologist-Level Evaluation for X-Ray Report Generation by Zilong Wang, Xufang Luo, Xinyang Jiang, Dongsheng…
Gecko: Versatile Text Embeddings Distilled from Large Language Models by Jinhyuk Lee, Zhuyun Dai, Xiaoqi Ren,…
Learning to Project for Cross-Task Knowledge Distillation by Dylan Auty, Roy Miles, Benedikt Kolbeinsson, Krystian Mikolajczyk. First…
Scale Decoupled Distillation by Shicai Wei, Chunbo Luo, Yang Luo. First submitted to arxiv on: 20 Mar…
Knowledge Distillation in YOLOX-ViT for Side-Scan Sonar Object Detection by Martin Aubard, László Antal, Ana Madureira,…
Continual All-in-One Adverse Weather Removal with Knowledge Replay on a Unified Network Structure by De Cheng,…
MEND: Meta dEmonstratioN Distillation for Efficient and Effective In-Context Learning by Yichuan Li, Xiyao Ma, Sixing…
MKF-ADS: Multi-Knowledge Fusion Based Self-supervised Anomaly Detection System for Control Area Network by Pengzhou Cheng, Zongru…
Privacy-preserving Fine-tuning of Large Language Models through Flatness by Tiejin Chen, Longchao Da, Huixue Zhou, Pingzhi…
Learning to Maximize Mutual Information for Chain-of-Thought Distillation by Xin Chen, Hanxian Huang, Yanjun Gao, Yi…