Summary of Banishing LLM Hallucinations Requires Rethinking Generalization, by Johnny Li et al.
Banishing LLM Hallucinations Requires Rethinking Generalization, by Johnny Li, Saksham Consul, Eda Zhou, James Wong, Naila…