Summary of PLDR-LLM: Large Language Model from Power Law Decoder Representations, by Burc Gokden
PLDR-LLM: Large Language Model from Power Law Decoder Representations, by Burc Gokden. First submitted to arXiv on: …