Summary of Revealing the Utilized Rank Of Subspaces Of Learning in Neural Networks, by Isha Garg et al.
Revealing the Utilized Rank of Subspaces of Learning in Neural Networks, by Isha Garg, Christian Koguchi,…