Summary of Pretrained Generative Language Models as General Learning Frameworks for Sequence-Based Tasks, by Ben Fauber
Pretrained Generative Language Models as General Learning Frameworks for Sequence-Based Tasks, by Ben Fauber. First submitted to…
In-Context Learning Can Re-learn Forbidden Tasks, by Sophie Xhonneux, David Dobre, Jian Tang, Gauthier Gidel, Dhanya…
Implicit Diffusion: Efficient Optimization through Stochastic Sampling, by Pierre Marion, Anna Korba, Peter Bartlett, Mathieu Blondel,…
ApiQ: Finetuning of 2-Bit Quantized Large Language Model, by Baohao Liao, Christian Herold, Shahram Khadivi, Christof…
Assessing the Brittleness of Safety Alignment via Pruning and Low-Rank Modifications, by Boyi Wei, Kaixuan Huang,…
Multi-Patch Prediction: Adapting LLMs for Time Series Representation Learning, by Yuxuan Bian, Xuan Ju, Jiangtong Li,…
L4Q: Parameter Efficient Quantization-Aware Fine-Tuning on Large Language Models, by Hyesung Jeon, Yulhwa Kim, Jae-joon Kim. First…
Source-Free Domain Adaptation with Diffusion-Guided Source Data Generation, by Shivang Chopra, Suraj Kothawade, Houda Aynaou, Aman…
Curvature-Informed SGD via General Purpose Lie-Group Preconditioners, by Omead Pooladzandi, Xi-Lin Li. First submitted to arxiv on:…
InfLLM: Training-Free Long-Context Extrapolation for LLMs with an Efficient Context Memory, by Chaojun Xiao, Pengle Zhang,…