Summary of Breaking Class Barriers: Efficient Dataset Distillation via Inter-Class Feature Compensator, by Xin Zhang et al.
Breaking Class Barriers: Efficient Dataset Distillation via Inter-Class Feature Compensator
by Xin Zhang, Jiawei Du, Ping Liu, Joey Tianyi Zhou
First submitted to arXiv on: 13 Aug 2024
Categories
- Main: Computer Vision and Pattern Recognition (cs.CV)
- Secondary: Machine Learning (cs.LG)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract here |
Medium | GrooveSquid.com (original content) | This paper presents the Inter-class Feature Compensator (INFER), a novel dataset distillation approach that moves beyond the class-specific data-label framework prevalent in current methods. INFER leverages a Universal Feature Compensator (UFC) to integrate features across classes, generating multiple synthetic instances from a single UFC input (see the code sketch after the table). This improves both efficiency and effectiveness, reducing the soft-label storage of the synthetic dataset to almost zero. INFER achieves state-of-the-art performance on benchmarks such as ImageNet-1k, outperforming SRe2L by 34.5% with ResNet-18. |
Low | GrooveSquid.com (original content) | Dataset distillation condenses the features of large natural datasets into a compact, synthetic form. Current methods, however, follow a class-specific synthesis paradigm in which each synthetic sample is optimized for a single pre-assigned label. This paper presents INFER, which breaks that barrier by using a Universal Feature Compensator (UFC) to integrate features across classes, making distillation more efficient and effective while keeping the soft-label overhead of the synthetic data near zero. |
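The sketch below illustrates the "one compensator, many instances" idea described in the medium summary: a single learnable compensator tensor is added to base images from different classes, and soft labels come from a teacher model at distillation time rather than being stored per instance. This is a minimal, hypothetical illustration, not the authors' implementation; all names (`ufc`, `base_images`, `teacher`) and the placeholder data are assumptions for demonstration only.

```python
# Hypothetical sketch of a single Universal Feature Compensator (UFC) shared
# across classes. Placeholder tensors stand in for real images and a trained
# teacher network; nothing here reproduces the paper's actual training code.
import torch
import torch.nn as nn

num_classes, channels, size = 10, 3, 32

# One base image per class (random placeholders standing in for real samples).
base_images = torch.randn(num_classes, channels, size, size)

# A single learnable compensator shared by every class.
ufc = nn.Parameter(torch.zeros(channels, size, size))

# A frozen, pretrained teacher would normally supply soft labels; a randomly
# initialized CNN stands in for it here.
teacher = nn.Sequential(
    nn.Conv2d(channels, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, num_classes),
)
teacher.eval()

# One UFC -> many synthetic instances: broadcast-add the shared compensator
# to every base image.
synthetic = base_images + ufc  # shape: (num_classes, C, H, W)

# Soft labels are produced by the teacher on the fly, so they need not be
# stored per instance, which is one way the label footprint stays near zero.
with torch.no_grad():
    soft_labels = teacher(synthetic).softmax(dim=-1)

print(synthetic.shape, soft_labels.shape)
```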
Keywords
» Artificial intelligence » Distillation » Synthetic data