Summary of From Symbolic Tasks to Code Generation: Diversification Yields Better Task Performers, by Dylan Zhang et al.