Summary of QueEn: A Large Language Model for Quechua-English Translation, by Junhao Chen et al.


QueEn: A Large Language Model for Quechua-English Translation

by Junhao Chen, Peng Shu, Yiwei Li, Huaqin Zhao, Hanqi Jiang, Yi Pan, Yifan Zhou, Zhengliang Liu, Lewis C Howe, Tianming Liu

First submitted to arXiv on: 6 Dec 2024

Categories

  • Main: Computation and Language (cs.CL)
  • Secondary: Artificial Intelligence (cs.AI)


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper proposes QueEn, a novel approach for translating Quechua to English that combines Retrieval-Augmented Generation (RAG) with parameter-efficient fine-tuning. The method draws on external linguistic resources (such as dictionaries and parallel texts) through RAG and uses Low-Rank Adaptation (LoRA) to adapt the model without updating all of its parameters. Experimental results show that QueEn substantially outperforms baseline models, achieving a BLEU score of 17.6 compared with 1.5 for standard GPT models. The approach addresses the core challenges of low-resource language translation while remaining computationally efficient.
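The paper itself does not publish its pipeline in this summary, but the RAG step it describes can be sketched in a few lines: before translating, look up words from the source sentence in an external linguistic resource and prepend the matches to the prompt given to the fine-tuned model. The toy dictionary, function names, and prompt format below are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the retrieval-augmented step described above.
# A small bilingual dictionary stands in for the external linguistic
# resources (dictionaries, parallel texts) that the paper retrieves from.

TOY_DICTIONARY = {
    "allin": "good / well",
    "p'unchay": "day",
    "wasi": "house",
}


def retrieve_entries(sentence: str, dictionary: dict) -> list:
    """Return a gloss line for every word in the sentence with an entry."""
    entries = []
    for word in sentence.lower().split():
        if word in dictionary:
            entries.append(f"{word}: {dictionary[word]}")
    return entries


def build_prompt(sentence: str, dictionary: dict) -> str:
    """Assemble a translation prompt augmented with the retrieved glosses."""
    context = "\n".join(retrieve_entries(sentence, dictionary))
    return (
        "Relevant dictionary entries:\n"
        f"{context}\n\n"
        f"Translate this Quechua sentence into English: {sentence}"
    )


print(build_prompt("allin p'unchay", TOY_DICTIONARY))
```

In the full system, the resulting prompt would be sent to the LoRA-adapted model; the retrieved glosses give the model lexical grounding it would otherwise lack for a low-resource language.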
Low Difficulty Summary (written by GrooveSquid.com, original content)
Quechua is an Indigenous language spoken in South America, but it's hard to translate because little data is available and it carries cultural nuances that are tricky to capture. Researchers have built large language models that can help with many tasks, but these models struggle with low-resource languages like Quechua. This paper presents a new way to translate Quechua into English by combining two ideas: pulling in information from outside sources, like dictionaries, to help the model, and adapting the model in a way that is both efficient and accurate.

Keywords

» Artificial intelligence  » BLEU  » Fine-tuning  » GPT  » LoRA  » Low-rank adaptation  » Parameter-efficient  » RAG  » Retrieval-augmented generation  » Translation