
Summary of TabConv: Low-Computation CNN Inference via Table Lookups, by Neelesh Gupta et al.


TabConv: Low-Computation CNN Inference via Table Lookups

by Neelesh Gupta, Narayanan Kannan, Pengmiao Zhang, Viktor Prasanna

First submitted to arXiv on: 8 Apr 2024

Categories

  • Main: Computer Vision and Pattern Recognition (cs.CV)
  • Secondary: Machine Learning (cs.LG); Neural and Evolutionary Computing (cs.NE)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (by the paper authors)
Read the original abstract here

Medium Difficulty Summary (original content by GrooveSquid.com)
A novel table-based approximation called TabConv is proposed to reduce arithmetic operations during convolutional neural network (CNN) inference. The approach leverages algorithmic acceleration via approximate matrix multiplication, replacing computation with table lookups, to bridge the gap between hardware and software. A priority masking technique based on cosine similarity is introduced to select which layers receive the table-based approximation while maintaining model performance. TabConv is evaluated on popular CNNs such as ResNet-18, ResNet-34, and Network in Network (NIN) using the CIFAR-10, CIFAR-100, and MNIST datasets, achieving low-computation inference while preserving over 93% of the original models' performance. This approach has significant implications for deploying CNNs in a hardware-efficient manner.
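The summary above describes replacing a layer's matrix multiplications with table lookups learned through approximate matrix multiplication. A minimal sketch of that general idea, in the style of product-quantization-based approximate matmul (the exact table construction in the paper may differ; the function names, sizes, and the crude prototype selection here are illustrative assumptions):

```python
import numpy as np

def build_tables(A_train, B, n_subspaces=4, n_prototypes=16, seed=0):
    """Split the input dimension into subspaces, pick per-subspace
    prototypes from training activations A_train, and precompute their
    dot products with the fixed weight matrix B."""
    rng = np.random.default_rng(seed)
    d = A_train.shape[1]
    sub = d // n_subspaces
    prototypes, tables = [], []
    for s in range(n_subspaces):
        chunk = A_train[:, s * sub:(s + 1) * sub]
        # Crude prototype choice: random training rows.
        # (Real systems typically use k-means here.)
        idx = rng.choice(len(chunk), n_prototypes, replace=False)
        P = chunk[idx]                               # (n_prototypes, sub)
        prototypes.append(P)
        tables.append(P @ B[s * sub:(s + 1) * sub])  # (n_prototypes, n_out)
    return prototypes, tables

def lookup_matmul(A, prototypes, tables):
    """Approximate A @ B: encode each subvector of A as its nearest
    prototype, then sum the matching precomputed table rows, so
    inference needs no multiplications beyond the encoding step."""
    n_subspaces = len(prototypes)
    sub = A.shape[1] // n_subspaces
    out = np.zeros((A.shape[0], tables[0].shape[1]))
    for s in range(n_subspaces):
        chunk = A[:, s * sub:(s + 1) * sub]
        # Nearest prototype per row (squared Euclidean distance).
        d2 = ((chunk[:, None, :] - prototypes[s][None]) ** 2).sum(-1)
        codes = d2.argmin(axis=1)
        out += tables[s][codes]
    return out
```

Accuracy then depends on how well the prototypes cover the layer's actual activation distribution, which is why approximating every layer this way can degrade the model and motivates selecting layers carefully.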
Low Difficulty Summary (original content by GrooveSquid.com)
TabConv is a new way to make convolutional neural networks (CNNs) run faster on computers. Instead of doing lots of complicated math, it looks up precomputed answers in tables, which makes the calculations fast and efficient. The idea behind TabConv is to carefully choose which parts of the network use this table-based method, so that the overall performance of the network isn't hurt too much. The researchers tested their approach with three different networks (ResNet-18, ResNet-34, and NIN) on datasets like CIFAR-10, CIFAR-100, and MNIST, and it worked really well: it speeds up the calculations a lot without sacrificing much accuracy.
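Both summaries mention choosing which layers to approximate using cosine similarity. A hedged sketch of that selection idea, assuming a simple threshold rule (the paper's actual priority-masking procedure may rank or mask layers differently; `priority_mask` and its threshold are illustrative):

```python
import numpy as np

def cosine_similarity(x, y):
    """Cosine similarity between two tensors, flattened to vectors."""
    x, y = x.ravel(), y.ravel()
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12))

def priority_mask(exact_outputs, approx_outputs, threshold=0.9):
    """Keep the table-based approximation only for layers whose
    approximate output stays close (cosine similarity >= threshold)
    to the exact output; other layers fall back to exact compute."""
    return [cosine_similarity(e, a) >= threshold
            for e, a in zip(exact_outputs, approx_outputs)]
```

Layers whose table-lookup output diverges from the exact computation are left as ordinary convolutions, which is how the method can keep most of the original accuracy while still skipping arithmetic in the remaining layers.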

Keywords

* Artificial intelligence  * CNN  * Cosine similarity  * Inference  * Neural network  * ResNet