Summary of "Evaluating Graph-based Explanations for AI-based Recommender Systems," by Simon Delarue et al.


Evaluating graph-based explanations for AI-based recommender systems

by Simon Delarue, Astrid Bertrand, Tiphaine Viard

First submitted to arXiv on: 17 Jul 2024

Categories

  • Main: Artificial Intelligence (cs.AI)
  • Secondary: Human-Computer Interaction (cs.HC)


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
The paper explores the effectiveness of graph-based explanations in improving users’ perception of AI-based recommendations. It aims to determine whether graph-based explanations can enhance user understanding, usability, and curiosity towards the AI system. The study combines qualitative and quantitative approaches: users’ requirements for graph explanations are first collected through a qualitative study, followed by a larger quantitative study evaluating how various explanation designs influence user perception. The findings suggest that although users state a preference for graph-based explanations, textual explanations lead to higher objective understanding, and users’ satisfaction ratings are in fact lower for graph-based explanations than for the textual design.

Low Difficulty Summary (written by GrooveSquid.com, original content)
The paper is about how well people understand AI recommendations when they’re explained in different ways. Researchers want to know whether using pictures and diagrams (called graphs) can help people get a better grasp of what the AI system is doing. They did two studies: one where they asked people what they wanted from graph explanations, and another where they tested different explanation designs to see how well people understood them. The results show that people like the idea of using graphs, but when it comes down to it, written explanations help them understand things better.

Keywords

» Artificial intelligence