Summary of Few and Fewer: Learning Better From Few Examples Using Fewer Base Classes, by Raphael Lafargue et al.
Few and Fewer: Learning Better from Few Examples Using Fewer Base Classes
by Raphael Lafargue, Yassir Bendou, Bastien Pasdeloup, Jean-Philippe Diguet, Ian Reid, Vincent Gripon, Jack Valmadre
First submitted to arXiv on: 29 Jan 2024
Categories
- Main: Computer Vision and Pattern Recognition (cs.CV)
- Secondary: Artificial Intelligence (cs.AI)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here. |
| Medium | GrooveSquid.com (original content) | When training data is scarce, it is common to use a pre-trained feature extractor. However, fine-tuning the extractor on the target dataset is ineffective in few-shot learning, since that dataset contains only a handful of examples. Directly adopting the frozen features instead relies on the base and target distributions being similar enough that the features separate and generalize well on the target task. This paper investigates whether better features can be obtained by training on fewer base classes, i.e., by identifying a more useful subset of the base dataset, for cross-domain few-shot image classification across eight domains from Meta-Dataset. The study considers three realistic settings (domain-informed, task-informed, and uninformed), in which progressively less is known about the target task, and demonstrates that fine-tuning on a carefully selected subset of base classes can significantly improve few-shot accuracy. The contributions are simple, intuitive methods that can be incorporated into any few-shot solution, along with insights into the conditions under which they are likely to yield a boost in accuracy. Code to reproduce all experiments is released on GitHub. |
| Low | GrooveSquid.com (original content) | When there is not much training data, people often use a feature extractor that was already trained on a lot of data. But fine-tuning it does not work well when you only have a few examples. Instead, you can use the features as they are, as long as the base data is similar enough to your target data. This paper asks whether you can get even better features by training on fewer classes from the base dataset. The study shows that fine-tuning the feature extractor on a few carefully chosen base classes can really help with few-shot learning. The methods are simple and easy to add to any few-shot solution, and the paper also explains when these solutions are likely to work well. |
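The summaries above describe selecting a subset of base classes that best matches the target domain before fine-tuning. The paper's actual selection criteria are not detailed here, so the sketch below is only an illustration of the general idea, in a "domain-informed" spirit: it ranks base classes by the cosine similarity of their feature prototype to the centroid of unlabeled target-domain features and keeps the top-k. All function and variable names are hypothetical.

```python
import numpy as np

def select_base_classes(base_feats, base_labels, target_feats, k):
    """Illustrative sketch (not the paper's exact method): rank base
    classes by cosine similarity between each class prototype and the
    target-domain feature centroid, and keep the k best-matching ones."""
    # L2-normalize feature vectors so dot products are cosine similarities.
    def l2(x):
        return x / np.linalg.norm(x, axis=1, keepdims=True)

    base_feats = l2(np.asarray(base_feats, dtype=float))
    target_feats = l2(np.asarray(target_feats, dtype=float))

    # Single centroid summarizing the (unlabeled) target domain.
    target_centroid = target_feats.mean(axis=0)

    classes = np.unique(base_labels)
    # Prototype = mean normalized feature of each base class.
    scores = np.array([
        base_feats[base_labels == c].mean(axis=0) @ target_centroid
        for c in classes
    ])

    # Highest-similarity classes first; keep the top k.
    order = np.argsort(scores)[::-1]
    return classes[order[:k]]
```

The selected class IDs would then define the reduced base dataset on which the feature extractor is fine-tuned before few-shot evaluation.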
Keywords
» Artificial intelligence » Few shot » Fine tuning » Generalization » Image classification