Summary of "Prompting a Pretrained Transformer Can Be a Universal Approximator" by Aleksandar Petrov et al.
Prompting a Pretrained Transformer Can Be a Universal Approximator by Aleksandar Petrov, Philip H.S. Torr, Adel…
2D Matryoshka Sentence Embeddings by Xianming Li, Zongxi Li, Jing Li, Haoran Xie, Qing Li. First submitted…
Linear Transformers are Versatile In-Context Learners by Max Vladymyrov, Johannes von Oswald, Mark Sandler, Rong Ge. First…
BeTAIL: Behavior Transformer Adversarial Imitation Learning from Human Racing Gameplay by Catherine Weaver, Chen Tang, Ce…
Comparing Graph Transformers via Positional Encodings by Mitchell Black, Zhengchao Wan, Gal Mishne, Amir Nayyeri, Yusu…
Do Efficient Transformers Really Save Computation? by Kai Yang, Jan Ackermann, Zhenyu He, Guhao Feng, Bohang…
Contextual Molecule Representation Learning from Chemical Reaction Knowledge by Han Tang, Shikun Feng, Bicheng Lin, Yuyan…
An Explainable Transformer-based Model for Phishing Email Detection: A Large Language Model Approach by Mohammad Amaz…
From Self-Attention to Markov Models: Unveiling the Dynamics of Generative Transformers by M. Emrullah Ildiz, Yixiao…
FinGPT-HPC: Efficient Pretraining and Finetuning Large Language Models for Financial Applications with High-Performance Computing by Xiao-Yang…