Summary of "Stable and Robust Deep Learning by Hyperbolic Tangent Exponential Linear Unit (TeLU)" by Alfredo Fernandez and Ankur Mali
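The activation named in the title combines the hyperbolic tangent with the exponential. A minimal sketch of it, assuming the definition TeLU(x) = x · tanh(eˣ) (the formulation associated with this paper; verify against the original text):

```python
import numpy as np

def telu(x):
    """TeLU activation, assumed here to be x * tanh(exp(x)).

    Smooth and non-monotonic: near-linear for large positive x
    (tanh(e^x) -> 1), and decaying toward 0 for large negative x.
    """
    x = np.asarray(x, dtype=float)
    return x * np.tanh(np.exp(x))

# Illustrative behavior at a few points:
print(telu(0.0))    # exactly 0, since the x factor vanishes
print(telu(5.0))    # close to 5, since tanh(e^5) is nearly 1
print(telu(-10.0))  # close to 0, since tanh(e^-10) is tiny
```

The `x * tanh(exp(x))` form keeps the function differentiable everywhere, unlike ReLU's kink at 0, which is the kind of smoothness property the title's "stable and robust" framing suggests.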