User Identification via Free Roaming Eye Tracking Data

by Rishabh Vallabh Varsha Haria, Amin El Abed, Sebastian Maneth

First submitted to arXiv on: 14 Mar 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Human-Computer Interaction (cs.HC)


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract, available on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com)
The paper presents a new dataset for user identification from eye movements recorded with a wearable eye tracker on a university campus. The dataset comprises a “free roaming” task, in which participants walk around the campus, and a “targeted roaming” task, in which they search for a specific room in a library. Using a Radial Basis Function Network (RBFN) as the classifier, the authors achieve best accuracies of 87.3% for free roaming and 89.4% for targeted roaming. These results compare favorably with those reported in laboratory settings, which often require specialized equipment and may not be feasible in real-world scenarios. The authors also investigate how including higher-order velocity derivatives affects user identification accuracy.
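The summary names the classifier family (RBFN) but not the authors' feature pipeline, center-selection rule, or hyperparameters. As a rough illustration only, here is a minimal RBF-network sketch on synthetic data: Gaussian activations over centers drawn from the training points, with a closed-form linear least-squares readout. The cluster data, number of centers, and `gamma` value are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(X, centers, gamma):
    # Gaussian radial basis activations: exp(-gamma * ||x - c||^2)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def fit_rbfn(X, y, n_centers=15, gamma=2.0):
    # Simple center choice: a random subset of the training points
    idx = rng.choice(len(X), size=n_centers, replace=False)
    centers = X[idx]
    Phi = rbf_features(X, centers, gamma)
    # One-hot targets, linear readout solved by least squares
    Y = np.eye(y.max() + 1)[y]
    W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return centers, W

def predict(X, centers, W, gamma=2.0):
    return rbf_features(X, centers, gamma).dot(W).argmax(axis=1)

# Synthetic stand-in: three "users", each a well-separated Gaussian
# cluster in a 2-D feature space (NOT real eye-tracking features)
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(50, 2))
               for m in ([0, 0], [2, 2], [0, 2])])
y = np.repeat([0, 1, 2], 50)

centers, W = fit_rbfn(X, y)
acc = (predict(X, centers, W) == y).mean()
```

The least-squares readout keeps training closed-form (no gradient descent), which is one common way RBFNs are fit; the real study would additionally need per-user gaze features such as the velocity derivatives mentioned above.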
Low Difficulty Summary (written by GrooveSquid.com)
This research paper looks at how well a computer program can figure out who a person is just by tracking their eye movements while they walk around or search for something specific. To test this, 41 people walked around a university campus or searched for a room in a library while their eye movements were recorded with a special device called a wearable eye tracker. The researchers then used a special kind of computer model to identify each person from their eye movement data. The model picked out the right person 87.3% of the time when participants walked around freely and 89.4% of the time when they searched for a specific room. That is pretty good compared to other studies that did similar things in a lab setting.

Keywords

* Artificial intelligence
* Tracking