Summary of Development of Pre-Trained Transformer-based Models for the Nepali Language, by Prajwal Thapa et al.
Development of Pre-Trained Transformer-based Models for the Nepali Language by Prajwal Thapa, Jinu Nyachhyon, Mridul Sharma,…
A Comparative Analysis of Transformer and LSTM Models for Detecting Suicidal Ideation on Reddit by Khalid…
AI Foundation Models for Wearable Movement Data in Mental Health Research by Franklin Y. Ruan, Aiwei…
The Zamba2 Suite: Technical Report by Paolo Glorioso, Quentin Anthony, Yury Tokpanov, Anna Golubeva, Vasudev Shyam,…
BanglaEmbed: Efficient Sentence Embedding Models for a Low-Resource Language Using Cross-Lingual Distillation Techniques by Muhammad Rafsan…
ElastiFormer: Learned Redundancy Reduction in Transformer via Self-Distillation by Junzhang Liu, Tingkai Liu, Yueyuan Sui, Stephen…
Transforming NLU with Babylon: A Case Study in Development of Real-time, Edge-Efficient, Multi-Intent Translation System…
Parameter Efficient Mamba Tuning via Projector-targeted Diagonal-centric Linear Transformation by Seokil Ham, Hee-Seon Kim, Sangmin Woo,…
Evaluating Vision Transformer Models for Visual Quality Control in Industrial Manufacturing by Miriam Alber, Christoph Hönes,…
RED: Effective Trajectory Representation Learning with Comprehensive Information by Silin Zhou, Shuo Shang, Lisi Chen, Christian…