
Summary of Till the Layers Collapse: Compressing a Deep Neural Network through the Lenses of Batch Normalization Layers, by Zhu Liao et al.


Till the Layers Collapse: Compressing a Deep Neural Network through the Lenses of Batch Normalization Layers

by Zhu Liao, Nour Hezbri, Victor Quétu, Van-Tam Nguyen, Enzo Tartaglione

First submitted to arXiv on 19 Dec 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Computation and Language (cs.CL); Computer Vision and Pattern Recognition (cs.CV)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same paper but are written at different levels of difficulty: the medium and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (original content by GrooveSquid.com)
The proposed TLC ("Till the Layers Collapse") method compresses deep neural networks by reducing their depth through the lens of batch normalization layers, which lowers computational requirements and latency. The approach is validated on popular models such as Swin-T, MobileNet-V2, and RoBERTa across image classification and NLP tasks (an illustrative code sketch follows the summaries).

Low Difficulty Summary (original content by GrooveSquid.com)
TLC helps reduce the size of large neural networks, making them more efficient to use. The method is tested on well-known models like Swin-T, MobileNet-V2, and RoBERTa for both image recognition and language processing tasks.
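To make the general idea concrete, below is a minimal, hypothetical PyTorch sketch of scoring layers through their batch normalization parameters (here, the mean absolute learnable scale gamma) and flagging low-scoring layers as removal candidates. The scoring rule, threshold, and helper names are assumptions for illustration only; the actual TLC criterion and removal procedure are defined in the paper and may differ.

```python
import torch
import torch.nn as nn

def bn_importance_scores(model: nn.Module) -> dict:
    """Score each BatchNorm2d layer by the mean absolute value of its
    learnable scale (gamma). A small gamma is used here as an illustrative
    proxy for a layer contributing little to the output; this is not
    necessarily the paper's exact metric."""
    scores = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.BatchNorm2d):
            scores[name] = module.weight.detach().abs().mean().item()
    return scores

# Example: score a small CNN and list layers below a chosen threshold.
# Note: on an untrained model all gammas start at 1.0, so meaningful
# scores only appear after training; removal would also be followed by
# fine-tuning in practice.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
)
scores = bn_importance_scores(model)
threshold = 0.1  # hypothetical cut-off, to be tuned per model and task
candidates = [name for name, s in scores.items() if s < threshold]
print("Layer-removal candidates:", candidates)
```

This sketch only identifies candidate layers; actually collapsing them and recovering accuracy (e.g., by fine-tuning) is where the method described in the paper does the real work.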

Keywords

» Artificial intelligence  » Batch normalization  » Image classification  » NLP