Summary of Convex Relaxations of ReLU Neural Networks Approximate Global Optima in Polynomial Time, by Sungyoon Kim et al.
Convex Relaxations of ReLU Neural Networks Approximate Global Optima in Polynomial Time, by Sungyoon Kim, Mert…
Disparate Impact on Group Accuracy of Linearization for Private Inference, by Saswat Das, Marco Romanelli, Ferdinando…
Stanceosaurus 2.0: Classifying Stance Towards Russian and Spanish Misinformation, by Anton Lavrouk, Ian Ligon, Tarek Naous,…
Decentralized Sporadic Federated Learning: A Unified Algorithmic Framework with Convergence Guarantees, by Shahryar Zehtabi, Dong-Jun Han,…
Stochastic Modified Flows for Riemannian Stochastic Gradient Descent, by Benjamin Gess, Sebastian Kassing, Nimit Rana. First submitted…
Exact Tensor Completion Powered by Slim Transforms, by Li Ge, Lin Chen, Yudong Chen, Xue Jiang. First…
The Information of Large Language Model Geometry, by Zhiquan Tan, Chenghai Li, Weiran Huang. First submitted to…
Rethinking the Role of Proxy Rewards in Language Model Alignment, by Sungdong Kim, Minjoon Seo. First submitted…
Estimating Epistemic and Aleatoric Uncertainty with a Single Model, by Matthew A. Chan, Maria J. Molina,…
Trillion Parameter AI Serving Infrastructure for Scientific Discovery: A Survey and Vision, by Nathaniel Hudson, J.…