Summary of Enabling On-device Continual Learning with Binary Neural Networks, by Lorenzo Vorabbi et al.
Enabling On-device Continual Learning with Binary Neural Networks by Lorenzo Vorabbi, Davide Maltoni, Guido Borghi, Stefano…
Activations and Gradients Compression for Model-Parallel Training by Mikhail Rudakov, Aleksandr Beznosikov, Yaroslav Kholodov, Alexander Gasnikov. First…
Extreme Compression of Large Language Models via Additive Quantization by Vage Egiazarian, Andrei Panferov, Denis Kuznedelev,…
EDA-DM: Enhanced Distribution Alignment for Post-Training Quantization of Diffusion Models by Xuewen Liu, Zhikai Li, Junrui…
HQ-VAE: Hierarchical Discrete Representation Learning with Variational Bayes by Yuhta Takida, Yukara Ikemiya, Takashi Shibuya, Kazuki…
Compressing LLMs: The Truth is Rarely Pure and Never Simple by Ajay Jaiswal, Zhe Gan, Xianzhi…
FlatENN: Train Flat for Enhanced Fault Tolerance of Quantized Deep Neural Networks by Akul Malhotra, Sumeet…
Semantic and Effective Communication for Remote Control Tasks with Dynamic Feature Compression by Pietro Talli, Francesco…
On-Device Training Under 256KB Memory by Ji Lin, Ligeng Zhu, Wei-Ming Chen, Wei-Chen Wang, Chuang Gan,…