
Shrinking Your TimeStep: Towards Low-Latency Neuromorphic Object Recognition with Spiking Neural Network

by Yongqi Ding, Lin Zuo, Mengmeng Jing, Pei He, Yongjun Xiao

First submitted to arXiv on: 2 Jan 2024

Categories

  • Main: Computer Vision and Pattern Recognition (cs.CV)
  • Secondary: Machine Learning (cs.LG); Image and Video Processing (eess.IV)


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract; read it on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com; original content)
This paper proposes the Shrinking Spiking Neural Network (SSNN), a novel approach to low-latency neuromorphic object recognition that does not compromise performance. Existing spiking neural networks (SNNs) suffer from significant latency, taking 10-40 timesteps or more to recognize objects. To alleviate this issue, SSNN divides the SNN into multiple stages with progressively shrinking timesteps, reducing inference latency while preserving information. The paper also introduces early classifiers during training to mitigate the mismatch between surrogate and true gradients, preventing performance degradation at low latency. Experimental results on neuromorphic datasets show that SSNN improves baseline accuracy by 6.55% to 21.41%, reaching 73.63% on CIFAR10-DVS with only 5 average timesteps. (A rough, illustrative code sketch of this staged design appears after the summaries below.)

Low Difficulty Summary (written by GrooveSquid.com; original content)
This paper is all about a new way for computers to recognize objects, called the Shrinking Spiking Neural Network (SSNN). Current methods take too long and get less accurate when they try to recognize things quickly. SSNN speeds things up by breaking the network into stages that use fewer and fewer timesteps, like a shrinking staircase. This helps the computer recognize things quickly without losing accuracy. It also adds special helpers during training to make sure the network doesn't forget what it learned. The results show that this new approach works really well on different types of data.

Keywords

  • Artificial intelligence
  • Inference
  • Neural network