
Phasor-Driven Acceleration for FFT-based CNNs

by Eduardo Reis, Thangarajah Akilan, Mohammed Khalid

First submitted to arXiv on: 1 Jun 2024

Categories

  • Main: Computer Vision and Pattern Recognition (cs.CV)
  • Secondary: Machine Learning (cs.LG); Signal Processing (eess.SP)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper proposes a novel approach to accelerating Convolutional Neural Networks (CNNs) by replacing spatial convolution with element-wise multiplication in the spectral domain using the phasor form, a polar representation of complex numbers. Swapping out the rectangular (Cartesian) complex-number form used in modern FFT-based CNN architectures yields superior speed during both training and inference. Experimental results show average speed-ups of 1.316× during training and 1.321× during inference on CIFAR-10, with similar gains on CIFAR-100: 1.299× during training and 1.300× during inference.
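The core idea the summary describes can be sketched with NumPy. This is a minimal illustration of the convolution theorem and of the polar-form equivalence, not the authors' implementation: spatial (circular) convolution equals element-wise multiplication of the two FFTs, and in phasor form that multiplication reduces to multiplying magnitudes and adding phases. All array names and sizes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))
kernel = rng.standard_normal((8, 8))  # assumed already zero-padded to the image size

# Rectangular (Cartesian) form: ordinary complex multiply in the spectral domain.
F_img = np.fft.fft2(image)
F_ker = np.fft.fft2(kernel)
rect_product = F_img * F_ker

# Phasor (polar) form: magnitudes multiply, phases add.
mag = np.abs(F_img) * np.abs(F_ker)
phase = np.angle(F_img) + np.angle(F_ker)
polar_product = mag * np.exp(1j * phase)

# Both forms produce the same spectrum (up to floating-point error).
assert np.allclose(rect_product, polar_product)

# Inverse FFT of the product gives the circular convolution of image and kernel.
conv_fft = np.real(np.fft.ifft2(rect_product))

# Check against a direct circular-convolution sum.
direct = np.zeros_like(image)
for u in range(8):
    for v in range(8):
        direct += image[u, v] * np.roll(np.roll(kernel, u, axis=0), v, axis=1)
assert np.allclose(conv_fft, direct)
```

The paper's claimed speed-up comes from doing the spectral-domain multiplications in this polar representation rather than the rectangular one; this sketch only verifies that the two representations are mathematically interchangeable.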
Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper finds a way to make Convolutional Neural Networks (CNNs) work faster by using a different way to do the math, called the phasor form. Right now, CNNs use an older method that involves lots of complicated calculations. The researchers in this paper discovered a shortcut that makes these calculations much faster. They tested their idea on two big datasets, CIFAR-10 and CIFAR-100, and their results show that the new way is up to 36% faster during training and inference.

Keywords

» Artificial intelligence  » CNN  » Inference