Summary of Practical Dataset Distillation Based on Deep Support Vectors, by Hyunho Lee et al.


Practical Dataset Distillation Based on Deep Support Vectors

by Hyunho Lee, Junhoo Lee, Nojun Kwak

First submitted to arXiv on: 1 May 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper and are written at different levels of difficulty. The medium and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from whichever version suits you best!

High Difficulty Summary (the paper’s original abstract, written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (original content by GrooveSquid.com)
The proposed distillation method incorporates general model knowledge by adding a Deep KKT (DKKT) loss, and it outperforms the baseline distribution matching distillation method on the CIFAR-10 dataset. Unlike conventional dataset distillation, the approach needs only a fraction of the entire dataset, making it practical for real-world scenarios. Combining the DKKT loss with Deep Support Vectors (DSVs) improves performance further, highlighting the method's potential across applications (a rough sketch of how such a combined objective could look appears after the summaries).

Low Difficulty Summary (original content by GrooveSquid.com)
In this paper, researchers developed a new way to shrink a dataset without sacrificing its quality. Their technique uses only a small part of the dataset, making it realistic for real-world scenarios where data is often distributed across many places rather than stored in one. The approach combines two ideas, the Deep KKT (DKKT) loss and Deep Support Vectors (DSVs), and together they improved performance on the well-known CIFAR-10 dataset.

Keywords

» Artificial intelligence  » Distillation