Summary of Measuring Sample Importance in Data Pruning for Language Models based on Information Entropy, by Minsang Kim et al.
Measuring Sample Importance in Data Pruning for Language Models based on Information Entropy, by Minsang Kim, …
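The title indicates that sample importance for data pruning is measured with information entropy. As a rough illustration only, and not the paper's actual scoring rule, the sketch below approximates a sample's importance by the Shannon entropy of its token-frequency distribution and keeps the highest-entropy fraction of the corpus. The whitespace tokenizer, the `keep_ratio` value, the function names, and the "higher entropy means more important" criterion are all assumptions made for this sketch, not details taken from the paper.

```python
import math
from collections import Counter


def sample_entropy(text: str) -> float:
    """Shannon entropy (in bits) of the sample's token-frequency distribution.

    Whitespace tokenization is a placeholder assumption; the paper's actual
    entropy measure may be defined differently.
    """
    tokens = text.split()
    if not tokens:
        return 0.0
    counts = Counter(tokens)
    total = len(tokens)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())


def prune_by_entropy(samples: list[str], keep_ratio: float = 0.7) -> list[str]:
    """Keep the `keep_ratio` fraction of samples with the highest entropy scores.

    Both the keep ratio and the ranking direction are illustrative choices.
    """
    scored = sorted(samples, key=sample_entropy, reverse=True)
    k = max(1, int(len(scored) * keep_ratio))
    return scored[:k]


if __name__ == "__main__":
    corpus = [
        "the the the the the",                     # repetitive, low entropy
        "data pruning keeps informative samples",  # diverse tokens, higher entropy
        "language models learn from varied text",
    ]
    for text in prune_by_entropy(corpus, keep_ratio=0.66):
        print(f"{sample_entropy(text):.2f}  {text}")
```

In practice one would more likely score entropy with a trained language model's predictive distribution rather than raw token counts, but the score-rank-and-keep structure of the pruning step stays the same.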
Other recent paper summaries:

LayerMerge: Neural Network Depth Compression through Layer Pruning and Merging, by Jinuk Kim, Marwa El Halabi, …
Federated Learning with a Single Shared Image, by Sunny Soni, Aaqib Saeed, Yuki M. Asano
SCORE: A 1D Reparameterization Technique to Break Bayesian Optimization’s Curse of Dimensionality, by Joseph Chakar
Attention Score is not All You Need for Token Importance Indicator in KV Cache Reduction: …
Multi-Dimensional Pruning: Joint Channel, Layer and Block Pruning with Latency Constraint, by Xinglong Sun, Barath Lakshmanan, …
Not All Prompts Are Made Equal: Prompt-based Pruning of Text-to-Image Diffusion Models, by Alireza Ganjdanesh, Reza …
Towards Efficient Target-Level Machine Unlearning Based on Essential Graph, by Heng Xu, Tianqing Zhu, Lefeng Zhang, …
Bypass Back-propagation: Optimization-based Structural Pruning for Large Language Models via Policy Gradient, by Yuan Gao, Zujing …
Pruning is Optimal for Learning Sparse Features in High-Dimensions, by Nuri Mert Vural, Murat A. Erdogdu