Summary of Fairness Under Cover: Evaluating the Impact of Occlusions on Demographic Bias in Facial Recognition, by Rafael M. Mamede et al.


Fairness Under Cover: Evaluating the Impact of Occlusions on Demographic Bias in Facial Recognition

by Rafael M. Mamede, Pedro C. Neto, Ana F. Sequeira

First submitted to arxiv on: 19 Aug 2024

Categories

  • Main: Computer Vision and Pattern Recognition (cs.CV)
  • Secondary: Artificial Intelligence (cs.AI)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (paper authors)
Read the original abstract here

Medium Difficulty Summary (GrooveSquid.com, original content)
This study examines how occlusions affect the fairness of face recognition systems, with a focus on demographic bias. The researchers added realistic occlusions to the Racial Faces in the Wild (RFW) dataset and evaluated their effect on models trained on the BUPT-Balanced and BUPT-GlobalFace datasets. They found increased dispersion in False Match Rate (FMR), False Non-Match Rate (FNMR), and accuracy, alongside decreased fairness as measured by Equalized Odds, Demographic Parity, the standard deviation of accuracy, and the Fairness Discrepancy Rate. A pixel attribution method was used to understand how strongly occlusions drive model predictions, and the authors introduce a new metric, the Face Occlusion Impact Ratio (FOIR), that quantifies the extent to which occlusions affect model performance across demographic groups. The results show that occlusions exacerbate existing biases: models place disproportionate emphasis on the occluded regions, and they do so unequally, affecting African individuals most severely.
Low Difficulty Summary (GrooveSquid.com, original content)
This study looks at how face recognition systems work when there are objects covering people’s faces. The researchers used a special dataset and added fake coverings to see how this affects the system’s performance. They found that these coverings make it harder for the system to be fair, especially when it comes to certain groups of people like African individuals.
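The paper's exact FOIR formula is not reproduced in these summaries, so the following is only a minimal sketch of the idea the medium summary describes: given a pixel-attribution map and a binary occlusion mask, compare how much attribution mass falls inside the occlusion against the share of the face the occlusion covers. The function name and normalization choice here are assumptions, not the authors' definition.

```python
import numpy as np

def face_occlusion_impact_ratio(attribution, occlusion_mask):
    """Hypothetical FOIR-style score (illustrative only, not the paper's formula).

    attribution    : 2D array of per-pixel attribution values for a prediction.
    occlusion_mask : 2D boolean array, True where the face is occluded.

    Returns the fraction of total attribution mass inside the occlusion,
    normalized by the occlusion's area share. A value above 1 means the
    model emphasizes the occluded region more than its size alone explains.
    """
    attribution = np.abs(np.asarray(attribution, dtype=float))
    occlusion_mask = np.asarray(occlusion_mask, dtype=bool)

    total_mass = attribution.sum()
    if total_mass == 0.0:
        return 0.0  # no attribution anywhere; occlusion carries no weight

    occluded_share = attribution[occlusion_mask].sum() / total_mass
    area_share = occlusion_mask.mean()
    return float(occluded_share / area_share)
```

Under this reading, comparing the score across demographic groups would surface the unequal emphasis the paper reports: a group whose occluded pixels attract disproportionately high attribution yields a higher ratio.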

Keywords

» Artificial intelligence  » Face recognition