Summary of ReDistill: Residual Encoded Distillation for Peak Memory Reduction, by Fang Chen et al.
ReDistill: Residual Encoded Distillation for Peak Memory Reduction, by Fang Chen, Gourav Datta, Mujahid Al Rafi,…