Summary of One Model for One Graph: A New Perspective for Pretraining with Cross-domain Graphs, by Jingzhe Liu et al.
One Model for One Graph: A New Perspective for Pretraining with Cross-domain Graphs by Jingzhe Liu,…
Scaling Particle Collision Data Analysis by Hengkui Wu, Panpan Chi, Yongfeng Zhu, Liujiang Liu, Shuyang Hu,…
Active Data Curation Effectively Distills Large-Scale Multimodal Models by Vishaal Udandarao, Nikhil Parthasarathy, Muhammad Ferjad Naeem,…
Instance-Aware Graph Prompt Learning by Jiazheng Li, Jundong Li, Chuxu Zhang. First submitted to arXiv on: 26…
RECAST: Reparameterized, Compact weight Adaptation for Sequential Tasks by Nazia Tasnim, Bryan A. Plummer. First submitted to…
Predicting Emergent Capabilities by Finetuning by Charlie Snell, Eric Wallace, Dan Klein, Sergey Levine. First submitted to…
VICON: A Foundation Model for Multi-Physics Fluid Dynamics via Vision In-Context Operator Networks by Yadi Cao,…
Cautious Optimizers: Improving Training with One Line of Code by Kaizhao Liang, Lizhang Chen, Bo Liu,…
The Zamba2 Suite: Technical Report by Paolo Glorioso, Quentin Anthony, Yury Tokpanov, Anna Golubeva, Vasudev Shyam,…
Context-Aware Multimodal Pretraining by Karsten Roth, Zeynep Akata, Dima Damen, Ivana Balažević, Olivier J. Hénaff. First submitted…