Summary of Feature Expansion and enhanced Compression for Class Incremental Learning, by Quentin Ferdinand (ENSTA Bretagne) et al.


Feature Expansion and enhanced Compression for Class Incremental Learning

by Quentin Ferdinand, Gilles Le Chenadec, Benoit Clement, Panagiotis Papadakis, Quentin Oliveau

First submitted to arXiv on: 13 May 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI); Computer Vision and Pattern Recognition (cs.CV)


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High difficulty summary (written by the paper authors)
The high difficulty version is the paper's original abstract, which can be read on the arXiv listing.

Medium difficulty summary (GrooveSquid.com, original content)
This paper presents a novel algorithm for class-incremental learning, in which a model must learn to classify a steadily growing set of classes over time. The method counters catastrophic forgetting by dynamically adding new feature extractors and then compressing previous knowledge with a Rehearsal-CutMix approach that mixes patches of stored past-class samples into new training images, which reduces forgetting and improves performance. Experiments on the CIFAR and ImageNet datasets show consistent improvements over state-of-the-art methods. (A rough code sketch of the mixing step follows the summaries.)

Low difficulty summary (GrooveSquid.com, original content)
This paper helps machines learn new things without forgetting old ones. It's like a person learning new words in their native language: they don't forget the old words! The researchers developed a way to make machine learning models better at this by adding new feature extractors and compressing the old knowledge. They tested it on lots of images and showed that their method works really well.
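To make the Rehearsal-CutMix idea above more concrete, here is a minimal, hypothetical sketch of a CutMix-style mixing step between new-task images and stored past-class (rehearsal) samples, written with PyTorch. The names rehearsal_cutmix, rand_bbox, mem_x, mem_y, and alpha are assumptions made for illustration; this is not the authors' implementation, only a sketch of the general mixing mechanism the summary describes.

```python
# Hypothetical sketch of a CutMix-style "Rehearsal-CutMix" step: patches of
# stored past-class samples are pasted into new-task images, and labels are
# mixed in proportion to the pasted area. Names are illustrative only.
import torch


def rand_bbox(height, width, lam):
    """Sample a random box covering roughly a (1 - lam) fraction of the image."""
    cut_ratio = (1.0 - lam) ** 0.5
    cut_h, cut_w = int(height * cut_ratio), int(width * cut_ratio)
    cy = torch.randint(height, (1,)).item()
    cx = torch.randint(width, (1,)).item()
    y1, y2 = max(cy - cut_h // 2, 0), min(cy + cut_h // 2, height)
    x1, x2 = max(cx - cut_w // 2, 0), min(cx + cut_w // 2, width)
    return y1, y2, x1, x2


def rehearsal_cutmix(new_x, new_y, mem_x, mem_y, alpha=1.0):
    """Mix patches of rehearsal-memory images into new-task images.

    new_x: (B, C, H, W) new-class images, new_y: (B,) their labels
    mem_x, mem_y: a same-sized batch of stored past-class samples and labels
    Returns mixed images, both label sets, and the mixing weight lam.
    """
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    _, _, h, w = new_x.shape
    y1, y2, x1, x2 = rand_bbox(h, w, lam)
    mixed_x = new_x.clone()
    mixed_x[:, :, y1:y2, x1:x2] = mem_x[:, :, y1:y2, x1:x2]
    # Recompute lam from the exact pasted area so the label mix matches pixels.
    lam = 1.0 - ((y2 - y1) * (x2 - x1)) / float(h * w)
    return mixed_x, new_y, mem_y, lam


# A training step could then weight the loss between the two label sets, e.g.
#   out = model(mixed_x)
#   loss = lam * criterion(out, new_y) + (1 - lam) * criterion(out, mem_y)
```

In a class-incremental setting, mixed images of this kind would be used while distilling previous knowledge, so that past classes are rehearsed inside every new-class batch rather than only from a small separate memory pass.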

Keywords

» Artificial intelligence  » Machine learning