Summary of Spatially-aware Transformer For Embodied Agents, by Junmo Cho et al.
Spatially-Aware Transformer for Embodied Agents, by Junmo Cho, Jaesik Yoon, Sungjin Ahn. First submitted to arxiv on: …
Entity-level Factual Adaptiveness of Fine-tuning based Abstractive Summarization Models, by Jongyoon Song, Nohil Park, Bongkyu Hwang, …
Has the Deep Neural Network learned the Stochastic Process? An Evaluation Viewpoint, by Harshit Kumar, Beomseok…
Quantum Theory and Application of Contextual Optimal Transport, by Nicola Mariella, Albert Akhriev, Francesco Tacchino, Christa…
tinyBenchmarks: evaluating LLMs with fewer examples, by Felipe Maia Polo, Lucas Weber, Leshem Choshen, Yuekai Sun, …
How Important Is Tokenization in French Medical Masked Language Models?, by Yanis Labrak, Adrien Bazoge, Beatrice…
Comparison of Machine Learning Classification Algorithms and Application to the Framingham Heart Study, by Nabil Kahouadji. First…
Towards Few-Shot Adaptation of Foundation Models via Multitask Finetuning, by Zhuoyan Xu, Zhenmei Shi, Junyi Wei, …
Divide-or-Conquer? Which Part Should You Distill Your LLM?, by Zhuofeng Wu, He Bai, Aonan Zhang, Jiatao…
Unintended Impacts of LLM Alignment on Global Representation, by Michael J. Ryan, William Held, Diyi Yang. First…