Summary of Learning World Models With Hierarchical Temporal Abstractions: A Probabilistic Perspective, by Vaisakh Shaj
Learning World Models With Hierarchical Temporal Abstractions: A Probabilistic Perspective, by Vaisakh Shaj. First submitted to arxiv…