Summary of TAGCOS: Task-agnostic Gradient Clustered Coreset Selection for Instruction Tuning Data, by Jipeng Zhang et al.
TAGCOS: Task-agnostic Gradient Clustered Coreset Selection for Instruction Tuning Data
by Jipeng Zhang, Yaxuan Qin, Renjie Pi, Weizhong Zhang, Rui Pan, Tong Zhang
First submitted to arXiv on: 21 Jul 2024
Categories
- Main: Computation and Language (cs.CL)
- Secondary: Artificial Intelligence (cs.AI); Machine Learning (cs.LG)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same paper but are written at different levels of difficulty. The medium- and low-difficulty versions are original summaries written by GrooveSquid.com, while the high-difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | The paper’s original abstract (read it on the arXiv page) |
Medium | GrooveSquid.com (original content) | This paper proposes Task-Agnostic Gradient Clustered Coreset Selection (TAGCOS), a novel method for extracting a small subset (coreset) from large instruction-tuning datasets that achieves performance comparable to training on the full dataset. The approach uses per-sample gradients as data representations, clusters similar samples, and employs an efficient greedy algorithm for coreset selection, addressing the challenge of choosing a representative coreset from diverse instruction datasets. Experiments show that TAGCOS outperforms other unsupervised methods even when selecting only 5% of the data (a minimal code sketch of the idea follows the table). |
Low | GrooveSquid.com (original content) | This paper is about finding a small but important part of big language datasets that can do the same job as the whole dataset. This helps computers learn faster and more efficiently. The authors developed a new way to select this “coreset” using gradients (signals that show how each training example affects the model’s learning) and by grouping similar data together. They tested their method on several large datasets and found it works well, even when choosing only a small part of the data. |
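
To make the pipeline described above concrete, here is a minimal Python sketch of gradient-clustered coreset selection. It is an illustration, not the authors' implementation: per-sample gradient features are assumed to be precomputed (they are simulated with random vectors below), and a simple nearest-to-centroid pick stands in for the paper's greedy selection step. The function names and parameters are placeholders chosen for this example.

```python
# Minimal sketch of gradient-clustered coreset selection (illustrative only,
# not the authors' implementation). Per-sample gradient features are assumed
# to be precomputed; here they are simulated with random vectors.
import numpy as np
from sklearn.cluster import KMeans


def select_coreset(grad_feats: np.ndarray, budget: int, n_clusters: int = 10,
                   seed: int = 0) -> np.ndarray:
    """Pick `budget` sample indices from per-sample gradient features.

    grad_feats: (n_samples, dim) matrix of gradient representations.
    budget:     total number of samples to keep (e.g., 5% of the dataset).
    """
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    labels = km.fit_predict(grad_feats)

    selected = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        if len(members) == 0:
            continue
        # Split the budget across clusters in proportion to cluster size.
        k = max(1, round(budget * len(members) / len(grad_feats)))
        # Stand-in for the paper's greedy step: keep the members whose
        # gradient features lie closest to the cluster centroid.
        dists = np.linalg.norm(grad_feats[members] - km.cluster_centers_[c], axis=1)
        selected.extend(members[np.argsort(dists)[:k]].tolist())
    return np.asarray(selected[:budget])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(1000, 64))   # placeholder gradient features
    coreset_idx = select_coreset(feats, budget=50)
    print(f"kept {len(coreset_idx)} of {feats.shape[0]} samples")
```

In the actual method the features would come from model gradients computed on each instruction example; how those gradients are obtained and compressed is beyond the scope of this sketch.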
Keywords
» Artificial intelligence » Unsupervised