Summary of ptt5-v2: A Closer Look at Continued Pretraining of T5 Models for the Portuguese Language, by Marcos Piau et al.
SuperPos-Prompt: Enhancing Soft Prompt Tuning of Language Models with Superposition of Multi Token Embeddings, by MohammadAli…
FOCUS: Forging Originality through Contrastive Use in Self-Plagiarism for Language Models, by Kaixin Lan, Tao Fang,…
LLMs Meet Multimodal Generation and Editing: A Survey, by Yingqing He, Zhaoyang Liu, Jingye Chen, Zeyue…
Zero-Shot Spam Email Classification Using Pre-trained Large Language Models, by Sergio Rojas-Galeano. First submitted to arXiv on:…
Pre-Calc: Learning to Use the Calculator Improves Numeracy in Language Models, by Vishruth Veerendranath, Vishwa Shah,…
Augmenting emotion features in irony detection with Large language modeling, by Yucheng Lin, Yuhan Xia, Yunfei…
On Linearizing Structured Data in Encoder-Decoder Language Models: Insights from Text-to-SQL, by Yutong Shao, Ndapa Nakashole. First…
Mitigating Misleading Chain-of-Thought Reasoning with Selective Filtering, by Yexin Wu, Zhuosheng Zhang, Hai Zhao. First submitted to…
Reshaping Free-Text Radiology Notes Into Structured Reports With Generative Transformers, by Laura Bergomi, Tommaso M. Buonocore,…