Summary of Improving SMOTE via Fusing Conditional VAE for Data-adaptive Noise Filtering, by Sungchul Hong et al.
Improving SMOTE via Fusing Conditional VAE for Data-adaptive Noise Filtering
by Sungchul Hong, Seunghwan An, Jong-June Jeon
First submitted to arXiv on: 30 May 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | The paper's original abstract, available on arXiv. |
| Medium | GrooveSquid.com (original content) | The paper investigates generative neural network models for data augmentation under class imbalance, comparing them with the conventional Synthetic Minority Over-sampling Technique (SMOTE). The authors introduce a framework that enhances SMOTE with a Variational Autoencoder (VAE): the density of each minority data point is quantified in a low-dimensional latent space, potentially degrading (noisy) points are excluded, and neighboring observations are interpolated directly in the data space. Experiments on several imbalanced datasets show that this simple procedure improves on SMOTE and outperforms deep generative models, suggesting that careful selection of minority data points and interpolation in the data space are beneficial for class-imbalance problems with limited data. (A minimal code sketch of this filter-then-interpolate idea follows the table.) |
| Low | GrooveSquid.com (original content) | The paper explores how to improve SMOTE, a method that helps balance the classes in a dataset. The researchers compare SMOTE with newer methods based on generative neural networks and propose a new way to use a Variational Autoencoder (VAE) to make SMOTE better. The approach looks at the density of the data points and uses that information to decide which points to keep before generating new ones. The results show that the method works well for class-imbalance problems, especially when there is not much data. |
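The summaries above describe a two-step recipe: score minority points by their density in a learned low-dimensional latent space, drop the low-density (likely noisy) ones, and interpolate the survivors SMOTE-style in the original data space. Below is a minimal sketch of that recipe, not the authors' implementation: PCA and a Gaussian kernel density estimate stand in for the paper's conditional VAE encoder and its latent density score, and the function name `density_filtered_smote` and its parameters are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KernelDensity, NearestNeighbors

def density_filtered_smote(X_min, n_synth, latent_dim=2, keep_frac=0.8,
                           k_neighbors=5, rng=None):
    """SMOTE-style oversampling restricted to high-density minority points.

    Sketch only: the paper scores density in a conditional-VAE latent
    space; here PCA plays the encoder and a Gaussian KDE plays the
    density score, to keep the example self-contained and runnable.
    """
    rng = np.random.default_rng(rng)
    # 1. Map minority samples to a low-dimensional "latent" space.
    Z = PCA(n_components=latent_dim).fit_transform(X_min)
    # 2. Estimate each point's latent log-density.
    log_dens = KernelDensity(bandwidth=0.5).fit(Z).score_samples(Z)
    # 3. Keep only the densest fraction; low-density points are treated
    #    as potential noise and excluded from interpolation.
    keep = log_dens >= np.quantile(log_dens, 1.0 - keep_frac)
    X_keep = X_min[keep]
    # 4. Interpolate between each chosen point and one of its k nearest
    #    kept neighbors *in the original data space*, as in SMOTE.
    nn = NearestNeighbors(n_neighbors=k_neighbors + 1).fit(X_keep)
    _, idx = nn.kneighbors(X_keep)
    synth = []
    for _ in range(n_synth):
        i = rng.integers(len(X_keep))
        j = idx[i][rng.integers(1, k_neighbors + 1)]  # index 0 is the point itself
        lam = rng.random()
        synth.append(X_keep[i] + lam * (X_keep[j] - X_keep[i]))
    return np.vstack(synth)
```

For example, `density_filtered_smote(X_minority, n_synth=200)` would return 200 synthetic minority samples; `keep_frac` and the KDE bandwidth are the knobs that play the role of the paper's data-adaptive noise filter.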
Keywords
» Artificial intelligence » Data augmentation » Deep learning » Latent space » Neural network