Summary of Infini-gram: Scaling Unbounded N-gram Language Models to a Trillion Tokens, by Jiacheng Liu et al.
Infini-gram: Scaling Unbounded n-gram Language Models to a Trillion Tokens, by Jiacheng Liu, Sewon Min, Luke…