Summary of Multimodal Clinical Reasoning Through Knowledge-augmented Rationale Generation, by Shuai Niu et al.
Multimodal Clinical Reasoning through Knowledge-augmented Rationale Generation, by Shuai Niu, Jing Ma, Liang Bai, Zhihua Wang,…