Summary of SMILES-Mamba: Chemical Mamba Foundation Models for Drug ADMET Prediction, by Bohao Xu et al.
SMILES-Mamba: Chemical Mamba Foundation Models for Drug ADMET Prediction by Bohao Xu, Yingzhou Lu, Chenhao Li,…
PowerPM: Foundation Model for Power Systems by Shihao Tu, Yupeng Zhang, Jing Zhang, Zhendong Fu, Yin…
Is Child-Directed Speech Effective Training Data for Language Models? by Steven Y. Feng, Noah D. Goodman,…
Hybrid diffusion models: combining supervised and generative pretraining for label-efficient fine-tuning of segmentation models by Bruno…
Scaling LLM Test-Time Compute Optimally can be More Effective than Scaling Model Parameters by Charlie Snell,…
Attenuation-adjusted deep learning of pore defects in 2D radiographs of additive manufacturing powders by Andreas Bjerregaard,…
A Causally Informed Pretraining Approach for Multimodal Foundation Models: Applications in Remote Sensing by Praveen Ravirathinam,…
Embedding And Clustering Your Data Can Improve Contrastive Pretraining by Luke Merrick. First submitted to arxiv on:…
HVM-1: Large-scale video models pretrained with nearly 5000 hours of human-like video data by A. Emin…
Pretraining a Neural Operator in Lower Dimensions by AmirPouya Hemmasian, Amir Barati Farimani. First submitted to arxiv…