Summary of Efficient LLM Context Distillation, by Rajesh Upadhayayaya et al.
Efficient LLM Context Distillation, by Rajesh Upadhayayaya, Zachary Smith, Christopher Kottmyer, Manish Raj Osti. First submitted to…
Dataset Distillation from First Principles: Integrating Core Information Extraction and Purposeful Learning, by Vyacheslav Kungurtsev, Yuanfang…
Error-controlled non-additive interaction discovery in machine learning models, by Winston Chen, Yifan Jiang, William Stafford Noble,…
Instant Adversarial Purification with Adversarial Consistency Distillation, by Chun Tong Lei, Hon Ming Yam, Zhongliang Guo,…
GSTAM: Efficient Graph Distillation with Structural Attention-Matching, by Arash Rasti-Meymandi, Ahmad Sajedi, Zhaopan Xu, Konstantinos N.…
Learning Harmonized Representations for Speculative Sampling, by Lefan Zhang, Xiaodan Wang, Yanhua Huang, Ruiwen Xu. First submitted…
Boosting Lossless Speculative Decoding via Feature Sampling and Partial Alignment Distillation, by Lujun Gui, Bin Xiao,…
The Mamba in the Llama: Distilling and Accelerating Hybrid Models, by Junxiong Wang, Daniele Paliotta, Avner…
Learning Differentially Private Diffusion Models via Stochastic Adversarial Distillation, by Bochao Liu, Pengju Wang, Shiming Ge. First…
Distilling Long-tailed Datasets, by Zhenghao Zhao, Haoxuan Wang, Yuzhang Shang, Kai Wang, Yan Yan. First submitted to…