Summary of Less is More: A Simple yet Effective Token Reduction Method for Efficient Multi-modal LLMs, by Dingjie Song et al.
DynamicNER: A Dynamic, Multilingual, and Fine-Grained Dataset for LLM-based Named Entity Recognition, by Hanjun Luo, Yingbin…
A Comprehensive Evaluation of Quantized Instruction-Tuned Large Language Models: An Experimental Analysis up to 405B, by…
RoMath: A Mathematical Reasoning Benchmark in Romanian, by Adrian Cosma, Ana-Maria Bucur, Emilian Radoi. First submitted to…
Diversity-grounded Channel Prototypical Learning for Out-of-Distribution Intent Detection, by Bo Liu, Liming Zhan, Yujie Feng, Zexin…
Improving the Efficiency of Visually Augmented Language Models, by Paula Ontalvilla, Aitor Ormazabal, Gorka Azkune. First submitted…
Fast Analysis of the OpenAI O1-Preview Model in Solving Random K-SAT Problem: Does the LLM…
Machine Learning and Theory Ladenness – A Phenomenological Account, by Alberto Termine, Emanuele Ratti, Alessandro Facchini. First…
Task Arithmetic for Language Expansion in Speech Translation, by Yao-Fei Cheng, Hayato Futami, Yosuke Kashiwagi, Emiru…
Zero-resource Hallucination Detection for Text Generation via Graph-based Contextual Knowledge Triples Modeling, by Xinyue Fang, Zhen…