Summary of Transformers to Predict the Applicability of Symbolic Integration Routines, by Rashid Barket et al.
Transformers to Predict the Applicability of Symbolic Integration Routines, by Rashid Barket, Uzma Shafiq, Matthew England,…
TrAct: Making First-layer Pre-Activations Trainable, by Felix Petersen, Christian Borgelt, Stefano Ermon. First submitted to arXiv on:…
LSEAttention is All You Need for Time Series Forecasting, by Dizhen Liang. First submitted to arXiv on:…
Return Augmented Decision Transformer for Off-Dynamics Reinforcement Learning, by Ruhan Wang, Yu Yang, Zhishuai Liu, Dongruo…
The Belief State Transformer, by Edward S. Hu, Kwangjun Ahn, Qinghua Liu, Haoran Xu, Manan Tomar,…
Generative forecasting of brain activity enhances Alzheimer’s classification and interpretation, by Yutong Gao, Vince D. Calhoun,…
Learning and Transferring Sparse Contextual Bigrams with Linear Transformers, by Yunwei Ren, Zixuan Wang, Jason D.…
TokenFormer: Rethinking Transformer Scaling with Tokenized Model Parameters, by Haiyang Wang, Yue Fan, Muhammad Ferjad Naeem,…
Does equivariance matter at scale?, by Johann Brehmer, Sönke Behrends, Pim de Haan, Taco Cohen. First submitted…
ProTransformer: Robustify Transformers via Plug-and-Play Paradigm, by Zhichao Hou, Weizhi Gao, Yuchen Shen, Feiyi Wang, Xiaorui…