
Efficient Data Subset Selection to Generalize Training Across Models: Transductive and Inductive Networks

by Eeshaan Jain, Tushar Nandy, Gaurav Aggarwal, Ashish Tendulkar, Rishabh Iyer, Abir De

First submitted to arXiv on: 18 Sep 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Machine Learning (stat.ML)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (GrooveSquid.com original content)
The paper proposes SubSelNet, a trainable subset-selection framework that generalizes across model architectures for efficient learning. It uses an attention-based neural gadget as a surrogate for trained deep neural networks, enabling quick prediction of how a model would behave on the data before training it. This yields two variants: Transductive-SubSelNet, which computes the subset separately for each model by solving a small optimization problem, and Inductive-SubSelNet, which uses a trained subset selector with no optimization at selection time. Experiments show that both variants outperform several existing methods across multiple real-world datasets.

Low Difficulty Summary (GrooveSquid.com original content)
SubSelNet is a new way to choose the right part of a training dataset so that models can learn efficiently. Usually, people pick data subsets tailored to one specific model, but such a subset does not carry over well to a different model. The new approach uses a special tool that quickly predicts which data points matter, even for complex models. Two versions were tested: one that solves a small problem for each new model, and one that applies a learned selector directly. Both performed better than other methods on real-world data.

Keywords

* Artificial intelligence  * Attention  * Optimization