
Summary of Granular-ball Representation Learning for Deep CNN on Learning with Label Noise, by Dawei Dai, Hao Zhu, Shuyin Xia, and Guoyin Wang


Granular-ball Representation Learning for Deep CNN on Learning with Label Noise

by Dawei Dai, Hao Zhu, Shuyin Xia, Guoyin Wang

First submitted to arXiv on: 5 Sep 2024

Categories

  • Main: Computer Vision and Pattern Recognition (cs.CV)
  • Secondary: Artificial Intelligence (cs.AI)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (written by GrooveSquid.com, original content)
The paper proposes a novel approach to label noise, a common issue in the training data of deep convolutional neural network (CNN) models. Existing solutions typically rely on data cleaning or on additional optimization terms that penalize mislabeled data, but these methods can weaken or discard useful data during training. To overcome this limitation, the authors propose a general granular-ball computing (GBC) module that can be embedded into a CNN model, so that the classifier predicts labels for granular-ball samples rather than for individual samples. The GBC module splits the input samples at the feature level, lets gradients propagate normally during backpropagation, and uses an experience replay policy to keep training stable. Experimental results demonstrate that the proposed method improves the robustness of CNN models without requiring additional data or extra optimization steps.
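To make the mechanism easier to picture, below is a minimal, hypothetical PyTorch sketch of how a granular-ball style grouping could sit between a CNN backbone and its classifier. The names (split_into_balls, GranularBallCNN) and the k-means-style splitting heuristic are illustrative assumptions rather than the authors' actual GBC design, and the experience replay policy mentioned in the abstract is omitted; the sketch only shows the idea of classifying ball centers (with majority labels) rather than individual, possibly mislabeled, samples.

```python
# Hypothetical sketch: a granular-ball style grouping between a CNN backbone
# and its classifier. The splitting heuristic and all names are assumptions
# made for illustration, not the paper's actual GBC module.
import torch
import torch.nn as nn
import torch.nn.functional as F


def split_into_balls(features, labels, num_balls=8, iters=5):
    """Group a batch of feature vectors into coarse 'balls'.

    Each ball is summarized by the mean of its members' features (its center)
    and the majority label of its members.
    """
    batch_size = features.shape[0]
    num_balls = min(num_balls, batch_size)
    # The assignment step runs on detached features, so only the final
    # per-ball mean below takes part in backpropagation.
    with torch.no_grad():
        f = features.detach()
        centers = f[torch.randperm(batch_size)[:num_balls]].clone()
        for _ in range(iters):
            assign = torch.cdist(f, centers).argmin(dim=1)
            for k in range(num_balls):
                member = assign == k
                if member.any():
                    centers[k] = f[member].mean(dim=0)
    ball_centers, ball_labels = [], []
    for k in range(num_balls):
        member = assign == k
        if member.any():
            # Differentiable mean of member features: gradients flow back
            # through it to the CNN backbone as usual.
            ball_centers.append(features[member].mean(dim=0))
            # Majority label of the ball: isolated mislabeled samples are
            # outvoted, which is the intuition behind the robustness gain.
            ball_labels.append(labels[member].mode().values)
    return torch.stack(ball_centers), torch.stack(ball_labels)


class GranularBallCNN(nn.Module):
    """Toy CNN whose classifier labels ball centers instead of single samples."""

    def __init__(self, num_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, images, labels):
        feats = self.backbone(images)                  # (B, 32) feature vectors
        centers, ball_labels = split_into_balls(feats, labels)
        logits = self.classifier(centers)              # one prediction per ball
        return logits, ball_labels


if __name__ == "__main__":
    model = GranularBallCNN()
    images = torch.randn(64, 3, 32, 32)
    noisy_labels = torch.randint(0, 10, (64,))
    logits, ball_labels = model(images, noisy_labels)
    loss = F.cross_entropy(logits, ball_labels)
    loss.backward()                                    # gradients propagate normally
    print(f"loss: {loss.item():.4f}")
```

In this sketch, because each ball's label is the majority vote of its members, a small fraction of mislabeled samples inside a ball is outvoted, which illustrates one way that grouping samples can add robustness to label noise without extra data cleaning.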
Low Difficulty Summary (written by GrooveSquid.com, original content)
The paper tackles a common problem in machine learning: label noise in deep CNN models. Label noise occurs when training data is mislabeled, which can make the model less effective. The authors show how to address this with a new module called general granular-ball computing (GBC). The module helps the model learn from groups of similar samples instead of individual ones: it splits the input samples into smaller groups and predicts labels for those groups. The authors also show that their method works without adding extra data or making big changes to the training process.

Keywords

» Artificial intelligence  » Backpropagation  » CNN  » Machine learning  » Neural network