NASH: Neural Architecture and Accelerator Search for Multiplication-Reduced Hybrid Models

by Yang Xu, Huihong Shi, Zhongfeng Wang

First submitted to arXiv on: 7 Sep 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI)

GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract, which can be read on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper proposes an approach to constructing deep neural networks (DNNs) that balances hardware efficiency and accuracy. The high computational cost of multiplications hinders the deployment of DNNs on edge devices, while multiplication-free models typically sacrifice accuracy. To overcome both limitations, the authors introduce NASH, a Neural Architecture and Accelerator Search framework for multiplication-reduced hybrid models. NASH uses a tailored zero-shot metric to pre-identify promising hybrid models before training, which alleviates gradient conflicts and improves search efficiency. The framework also integrates accelerator search into neural architecture search (NAS) to find the best pairing of model and accelerator. Experiments on CIFAR-100 and Tiny-ImageNet show that NASH achieves higher throughput, frames per second (FPS), and accuracy than state-of-the-art multiplication-based systems.
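
The summary names two mechanisms without illustrating them: training-free pre-filtering of candidate hybrid architectures via a zero-shot metric, and a joint search over model/accelerator pairings. The Python sketch below shows that two-stage flow under loudly labeled assumptions: the operator choices, the zero_shot_score proxy, and the estimated_throughput cost model are all illustrative placeholders, not the paper's actual metric, search space, or accelerator model.

```python
import random

# Hypothetical per-layer operator choices for a multiplication-reduced
# hybrid model (placeholder names, not the paper's search space).
CANDIDATE_OPS = ["conv_mul", "shift", "adder"]
# Hypothetical accelerator design points (e.g., processing-element array size).
ACCELERATOR_CONFIGS = [{"pe_array": n} for n in (64, 128, 256)]

def sample_hybrid_model(num_layers=8):
    """Sample a random hybrid architecture: one op per layer."""
    return [random.choice(CANDIDATE_OPS) for _ in range(num_layers)]

def zero_shot_score(model):
    """Placeholder for NASH's tailored zero-shot metric: rank candidates
    without any training. Here: count multiplication-free layers, with
    random noise to break ties."""
    return sum(op != "conv_mul" for op in model) + random.random()

def estimated_throughput(model, accel):
    """Placeholder accelerator cost model: more processing elements and
    fewer multiplications give a higher estimated throughput."""
    mults = sum(op == "conv_mul" for op in model)
    return accel["pe_array"] / (1 + mults)

# Stage 1: zero-shot pre-identification. Score many sampled candidates and
# keep a small shortlist, so only promising hybrids would ever be trained.
candidates = [sample_hybrid_model() for _ in range(1000)]
shortlist = sorted(candidates, key=zero_shot_score, reverse=True)[:10]

# Stage 2: joint search. Pair each shortlisted model with every accelerator
# configuration and keep the pairing with the best estimated throughput.
best_model, best_accel = max(
    ((m, a) for m in shortlist for a in ACCELERATOR_CONFIGS),
    key=lambda pair: estimated_throughput(*pair),
)
print("best pairing:", best_model, best_accel)
```

The structural point this illustrates is why the zero-shot stage helps search efficiency: ranking by a training-free score is cheap, so only the shortlist would incur training cost, and the accelerator sweep in stage 2 runs over ten models rather than a thousand.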

Low Difficulty Summary (written by GrooveSquid.com, original content)
NASH is a new way to make deep neural networks work better on devices like smartphones or smart home devices. Right now, these devices have trouble running these complex networks because it takes too much computer power to do all the math needed. NASH finds a balance between speed and accuracy by using special techniques that help the network use less energy while still being pretty good at recognizing things like pictures of animals or objects. This means devices can run these networks faster, which is important for apps that need to make decisions quickly.

Keywords

  • Artificial intelligence
  • Zero shot