Summary of BaCon: Boosting Imbalanced Semi-supervised Learning via Balanced Feature-level Contrastive Learning, by Qianhan Feng et al.
BaCon: Boosting Imbalanced Semi-supervised Learning via Balanced Feature-Level Contrastive Learning
by Qianhan Feng, Lujing Xie, Shijie Fang, Tong Lin
First submitted to arXiv on: 4 Mar 2024
Categories
- Main: Computer Vision and Pattern Recognition (cs.CV)
- Secondary: Machine Learning (cs.LG)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | High Difficulty Summary Read the original abstract here |
Medium | GrooveSquid.com (original content) | Medium Difficulty Summary Semi-supervised learning (SSL) has been shown to reduce the need for extensive annotations in deep learning, but its performance degrades when the underlying data distribution is imbalanced, a problem that unreliable pseudo-labels can further exacerbate. Most existing methods address this issue at the instance level through reweighting or resampling, but their performance is limited by the biased backbone representations they rely on. To overcome these limitations, the authors propose a Balanced Feature-Level Contrastive Learning method (BaCon) that directly regularizes the distribution of instances' representations in a well-designed contrastive manner. The method computes class-wise feature centers as positive anchors and selects negative anchors through a straightforward yet effective mechanism. A distribution-related temperature adjustment is leveraged to dynamically control the class-wise contrastive degrees. Comprehensive experiments on the CIFAR10-LT, CIFAR100-LT, STL10-LT, and SVHN-LT datasets demonstrate the effectiveness of BaCon, which surpasses instance-level methods like FixMatch-based ABC and feature-level methods like CoSSL. |
Low | GrooveSquid.com (original content) | Low Difficulty Summary Semi-supervised learning tries to use a small amount of labeled data to improve how well an AI model works. But this approach has a big problem: when the training data has way more examples of some classes than others, it can make the model worse. To fix this, the authors came up with a new method called BaCon. It helps by making sure that the features the model learns from the data are balanced and fair. They tested BaCon on several datasets and found that it works better than other methods in many cases. |
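To make the medium-difficulty summary concrete, here is a minimal NumPy sketch of a balanced feature-level contrastive loss in the spirit of BaCon: class-wise feature centers serve as positive anchors, the other classes' centers serve as negative anchors, and a per-class temperature depends on class frequency. This is an illustrative approximation, not the paper's implementation; in particular, the temperature schedule and the use of all other centers as negatives are assumptions made here for brevity.

```python
import numpy as np

def classwise_centers(feats, labels, num_classes):
    """L2-normalized mean feature per class (the positive anchors)."""
    centers = np.stack([feats[labels == c].mean(axis=0) for c in range(num_classes)])
    return centers / np.linalg.norm(centers, axis=1, keepdims=True)

def bacon_style_loss(features, labels, class_counts, base_tau=0.1):
    """Contrastive loss pulling each feature toward its class center.

    class_counts: per-class sample counts; the per-class temperature below
    is an ASSUMED schedule (tail classes get a sharper, lower temperature),
    standing in for the paper's distribution-related adjustment.
    """
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    num_classes = len(class_counts)
    centers = classwise_centers(feats, labels, num_classes)
    freq = np.asarray(class_counts, dtype=float)
    freq /= freq.sum()
    tau = base_tau * (0.5 + 0.5 * freq / freq.max())  # in [0.5*base_tau, base_tau]
    losses = []
    for i, y in enumerate(labels):
        # Own center is the positive; the remaining centers act as negatives.
        logits = feats[i] @ centers.T / tau[y]
        logits -= logits.max()                 # numerical stability
        probs = np.exp(logits) / np.exp(logits).sum()
        losses.append(-np.log(probs[y]))
    return float(np.mean(losses))
```

On well-clustered features the loss is small, because each instance sits close to its own class center and far from the others; with class-incoherent features it approaches the uniform cross-entropy, which is the behavior the contrastive regularizer exploits.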
Keywords
* Artificial intelligence * Deep learning * Semi-supervised * Temperature