Enabling Collaborative Clinical Diagnosis of Infectious Keratitis by Integrating Expert Knowledge and Interpretable Data-driven Intelligence

by Zhengqing Fang, Shuowen Zhou, Zhouhang Yuan, Yuxuan Si, Mengze Li, Jinxu Li, Yesheng Xu, Wenjia Xie, Kun Kuang, Yingming Li, Fei Wu, Yu-Feng Yao

First submitted to arXiv on: 14 Jan 2024

Categories

  • Main: Artificial Intelligence (cs.AI)
  • Secondary: Computer Vision and Pattern Recognition (cs.CV); Human-Computer Interaction (cs.HC)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper, each written at a different level of difficulty. The medium and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to read the version that suits you best!

High Difficulty Summary (the paper’s original abstract, by the paper authors)
Read the original abstract here

Medium Difficulty Summary (original content by GrooveSquid.com)
This paper proposes an interpretable model, the knowledge-guided diagnosis model (KGDM), whose visualized reasoning process exposes AI-based biomarkers and retrieves past cases with similar diagnostic patterns. Through human-AI interaction, KGDM also integrates clinicians’ prompts into that reasoning, improving both safety and accuracy. The study evaluates KGDM’s performance, interpretability, and clinical utility in diagnosing infectious keratitis (IK), a leading cause of corneal blindness. The AI-based biomarkers achieve diagnostic odds ratios (DOR) ranging from 3.011 to 35.233, consistent with clinical experience. In a human-AI collaborative diagnosis test, participants working with KGDM outperformed both unassisted clinicians and the AI alone. By combining interpretability with interaction, the study helps converge clinicians’ expertise and data-driven intelligence.
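For context, the diagnostic odds ratio of a binary test is the odds of a positive result among diseased patients divided by the odds of a positive result among healthy patients: DOR = (TP × TN) / (FP × FN), where a DOR of 1 means no discriminative value. Below is a minimal Python sketch of the computation; the confusion-matrix counts are illustrative placeholders, not figures from the paper.

```python
def diagnostic_odds_ratio(tp, fp, fn, tn):
    """DOR = (TP/FN) / (FP/TN) = (TP*TN) / (FP*FN):
    the odds of a positive result in diseased patients divided by
    the odds of a positive result in healthy patients."""
    if min(tp, fp, fn, tn) == 0:
        # Haldane-Anscombe correction: add 0.5 to every cell so the
        # ratio stays finite when any count is zero.
        tp, fp, fn, tn = tp + 0.5, fp + 0.5, fn + 0.5, tn + 0.5
    return (tp * tn) / (fp * fn)

# Illustrative counts only (not from the paper): a biomarker present in
# 80 of 100 diseased eyes and 15 of 100 healthy eyes.
print(diagnostic_odds_ratio(tp=80, fp=15, fn=20, tn=85))  # ~22.7
```

A DOR around 22.7 would fall inside the 3.011 to 35.233 range the paper reports for its AI-based biomarkers, meaning the biomarker’s presence substantially raises the odds of disease.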
Low Difficulty Summary (original content by GrooveSquid.com)
The paper develops an interpretable model that helps doctors understand how an AI reaches its medical diagnoses. This matters because AI models are often “black boxes” that don’t explain their decisions. The new model presents a visualized reasoning process, with AI-based biomarkers and retrieved similar cases, so doctors can see the thought process behind each diagnosis and use that information to make better decisions and improve patient care.
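The “retrieved cases” part of the reasoning process is, in general terms, a nearest-neighbour lookup in a learned feature space. The sketch below assumes cosine similarity over image embeddings; the embedding dimension, the random case database, and the label set are hypothetical placeholders, not the paper’s actual implementation.

```python
import numpy as np

def retrieve_similar_cases(query_emb, case_embs, case_labels, k=5):
    """Return the k archived cases whose embeddings are most similar
    (by cosine similarity) to the query image's embedding."""
    q = query_emb / np.linalg.norm(query_emb)
    db = case_embs / np.linalg.norm(case_embs, axis=1, keepdims=True)
    sims = db @ q                      # cosine similarity to each case
    top = np.argsort(-sims)[:k]       # indices of the k closest cases
    return [(case_labels[i], float(sims[i])) for i in top]

# Hypothetical database: 1000 archived cases with 512-dim embeddings.
rng = np.random.default_rng(0)
case_embs = rng.normal(size=(1000, 512))
case_labels = rng.choice(["bacterial", "fungal", "viral", "amoebic"], 1000)
query_emb = rng.normal(size=512)
print(retrieve_similar_cases(query_emb, case_embs, case_labels))
```

Showing the retrieved cases alongside the prediction lets a clinician check whether the AI’s “similar patients” actually resemble the one being diagnosed.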

Keywords

» Artificial intelligence