Summary of iSee: Advancing Multi-Shot Explainable AI Using Case-based Recommendations, by Anjana Wijekoon et al.


iSee: Advancing Multi-Shot Explainable AI Using Case-based Recommendations

by Anjana Wijekoon, Nirmalie Wiratunga, David Corsar, Kyle Martin, Ikechukwu Nkisi-Orji, Chamath Palihawadana, Marta Caro-Martínez, Belen Díaz-Agudo, Derek Bridge, Anne Liret

First submitted to arXiv on: 23 Aug 2024

Categories

  • Main: Artificial Intelligence (cs.AI)
  • Secondary: Human-Computer Interaction (cs.HC); Information Retrieval (cs.IR)


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (written by GrooveSquid.com, original content)
The paper proposes a novel approach to explainable AI (XAI) for decision-making: combining multiple explainers into an "explanation strategy" tailored to individual users or groups. This strategy aims to enhance user trust and satisfaction by providing personalized explanations. The iSee platform facilitates the sharing and reuse of explanation experiences, using Case-based Reasoning to advance best practices in XAI (a toy sketch of this retrieve-and-reuse step appears after the summaries below), and it enables designers to iteratively revise the most suitable explanation strategy for their AI system. A mixed-methods study with six design users of varying AI and XAI expertise evaluates the platform's usability and utility.

Low Difficulty Summary (written by GrooveSquid.com, original content)
XAI can make AI decision-making more trustworthy. A single explainer is often not enough to explain a complex AI decision. This paper shows that combining multiple explainers, an "explanation strategy", can help. The idea is to personalize explanations for individual users or groups based on their needs and expertise. A platform called iSee helps designers build these explanation strategies by sharing and reusing past experiences. It's like a library of best practices for making AI more understandable.

Keywords

» Artificial intelligence