Summary of All-around Neural Collapse for Imbalanced Classification, by Enhao Zhang et al.
All-around Neural Collapse for Imbalanced Classification
by Enhao Zhang, Chaohua Li, Chuanxing Geng, Songcan Chen
First submitted to arXiv on: 14 Aug 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Computer Vision and Pattern Recognition (cs.CV)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
High | Paper authors | High Difficulty Summary Read the original abstract here |
Medium | GrooveSquid.com (original content) | Medium Difficulty Summary In a neural network trained on a balanced dataset, Neural Collapse (NC) emerges during the terminal phase of training: individual activations (features), class means, and classifier weight vectors align into a structure with optimal inter-class separability. This structure is easily disrupted under imbalanced classification, producing minority collapse, in which the classifier vectors of minority classes are squeezed together. Existing methods focus on optimizing the classifier to recover NC, but the authors find that this squeezing also affects the class means. |
Low | GrooveSquid.com (original content) | Low Difficulty Summary When you train a neural network, the different parts of it work together to separate things into groups (like cats and dogs). This is called Neural Collapse. If everything works perfectly, the network can easily tell apart different groups. But if some groups have way more things than others, this perfection gets ruined. That’s called minority collapse. Usually, people try to fix this by changing how the network makes decisions. But we discovered that this problem also happens with the average features of each group. |
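As a geometric illustration (not code from the paper): under ideal Neural Collapse, the centered class means and classifier vectors form a simplex equiangular tight frame (ETF), so the cosine similarity between any two class directions approaches -1/(C-1) for C classes. The NumPy sketch below builds this ideal geometry directly to show the separability target that imbalance disrupts; the variable names are our own.

```python
import numpy as np

C = 4                                    # number of classes (illustrative choice)

# Centered class-mean directions of an ideal simplex ETF in C dimensions:
# row i is e_i minus the mean of the standard basis vectors.
M = np.eye(C) - np.ones((C, C)) / C
M /= np.linalg.norm(M, axis=1, keepdims=True)   # normalize each direction

cos = M @ M.T                            # pairwise cosine similarities
off_diag = cos[~np.eye(C, dtype=bool)]   # drop the self-similarity diagonal
print(off_diag.round(4))                 # each entry is -1/(C-1) = -0.3333
```

Minority collapse means the classifier vectors (and, per this paper, the class means) of rare classes fall short of this maximally separated configuration, with their pairwise cosines pushed toward each other instead.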
Keywords
» Artificial intelligence » Classification » Neural network