
Summary of OvSW: Overcoming Silent Weights for Accurate Binary Neural Networks, by Jingyang Xiang et al.


OvSW: Overcoming Silent Weights for Accurate Binary Neural Networks

by Jingyang Xiang, Zuohui Chen, Siqi Li, Qing Wu, Yong Liu

First submitted to arXiv on: 7 Jul 2024

Categories

  • Main: Computer Vision and Pattern Recognition (cs.CV)
  • Secondary: Artificial Intelligence (cs.AI)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract; read it on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
Binary Neural Networks (BNNs) have been shown to be highly effective for deploying deep neural networks on mobile and embedded platforms. While most existing works focus on minimizing quantization errors, improving representation ability, or designing gradient approximations, this paper investigates the efficiency of weight sign updates in BNNs. The authors observe that over 50% of weights remain unchanged during training; they call these "silent weights," which slow down convergence and lead to significant accuracy degradation. To address this, the authors propose Overcoming Silent Weights (OvSW), which uses Adaptive Gradient Scaling (AGS) to establish a relationship between the gradient and the latent weight distribution, improving the overall efficiency of weight sign updates. In addition, Silence Awareness Decaying (SAD) automatically identifies silent weights by tracking their flipping state and applies an additional penalty to facilitate flipping. The method achieves faster convergence and state-of-the-art performance on the CIFAR10 and ImageNet1K datasets across various architectures.
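To make the two components more concrete, below is a minimal PyTorch-style sketch of how silent-weight tracking with an extra decay penalty (in the spirit of SAD) and gradient rescaling toward the latent weight distribution (in the spirit of AGS) might look. The threshold, decay rate, and scaling rule are illustrative assumptions, not the paper's exact formulation.

    import torch

    def adaptive_gradient_scaling(weight, grad, eps=1e-12):
        # AGS-style sketch (assumed form): rescale the gradient so its typical
        # magnitude matches the latent weight distribution, making updates
        # large enough to flip weight signs.
        scale = weight.abs().mean() / (grad.abs().mean() + eps)
        return grad * scale

    class SilenceAwareDecay:
        # SAD-style sketch (assumed form): count how many steps each latent
        # weight has kept the same sign; weights silent past a threshold get
        # an extra decay toward zero so a future sign flip becomes easier.
        def __init__(self, weight, threshold=100, decay=1e-4):
            self.prev_sign = torch.sign(weight.detach())
            self.silent_steps = torch.zeros_like(weight)
            self.threshold = threshold
            self.decay = decay

        @torch.no_grad()
        def step(self, weight):
            sign = torch.sign(weight)
            flipped = sign != self.prev_sign
            # Reset the counter for weights that flipped; increment the rest.
            self.silent_steps = torch.where(
                flipped, torch.zeros_like(self.silent_steps), self.silent_steps + 1
            )
            self.prev_sign = sign
            silent = self.silent_steps >= self.threshold
            # Shrink long-silent weights toward zero (the additional penalty).
            weight[silent] -= self.decay * weight[silent]

In a training loop, adaptive_gradient_scaling would be applied to the latent weights' gradients before the optimizer step, and SilenceAwareDecay.step would be called once per iteration on each binarized layer's latent weights.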
Low Difficulty Summary (written by GrooveSquid.com, original content)
BNNs are a type of deep neural network that can be deployed on mobile and embedded platforms. Most existing work focuses on minimizing errors or improving representation ability, but this paper looks at how well BNNs update their weights. The authors found that most weights never change sign during training, which slows the network's learning. They propose a new method, called OvSW, that updates these "silent" weights so the network learns faster and more accurately.

Keywords

» Artificial intelligence  » Neural network  » Quantization  » Tracking