Summary of ShiftAddAug: Augment Multiplication-Free Tiny Neural Network with Hybrid Computation, by Yipin Guo et al.
ShiftAddAug: Augment Multiplication-Free Tiny Neural Network with Hybrid Computation
by Yipin Guo, Zihao Li, Yilin Lang, Qinyuan Ren
First submitted to arXiv on: 3 Jul 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI); Computer Vision and Pattern Recognition (cs.CV)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | The paper's original abstract, available on arXiv |
Medium | GrooveSquid.com (original content) | The paper introduces ShiftAddAug, which uses costly multiplication to augment efficient but less powerful multiplication-free operators, improving accuracy with no extra inference overhead. The tiny multiplication-free (ShiftAdd) network that will actually be deployed is embedded as a sub-model inside a larger multiplicative model and trained alongside it for additional supervision. Because the hybrid operators disagree on what the shared weights should be, a new weight-sharing method is proposed to resolve this discrepancy, and a two-stage neural architecture search is used to find smaller but stronger multiplication-free tiny networks that benefit most from augmentation. Experiments on image classification and semantic segmentation validate the approach with notable accuracy gains; code sketches of the core ideas follow the table. |
Low | GrooveSquid.com (original content) | ShiftAddAug makes tiny, energy-efficient neural networks more accurate by pairing their cheap operations with costly multiplications during training only. The small network is trained inside a bigger one, which helps it learn better, and a new weight-sharing trick fixes the problem of the two networks wanting different weights. A special two-stage search then picks the best small network to augment. The results show that ShiftAddAug works well on image classification and semantic segmentation, even beating larger networks; see the training sketch after the table. |
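For readers who want a concrete picture, below is a minimal PyTorch sketch (not the authors' code) of the two multiplication-free primitives that ShiftAdd-style networks are built from: a shift layer, whose weights are rounded to signed powers of two so each multiply becomes a hardware bit-shift, and an AdderNet-style add layer, which replaces dot products with negative L1 distances. The class names, shapes, and the straight-through estimator are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ShiftLinear(nn.Module):
    """Linear layer whose effective weights are signed powers of two."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.1)

    def forward(self, x):
        # Round each weight to sign(w) * 2^round(log2|w|); multiplying by a
        # power of two is a bit-shift in hardware. The straight-through
        # estimator lets gradients flow to the underlying float weights.
        w = self.weight
        p = torch.round(torch.log2(w.abs().clamp(min=1e-8)))
        w_q = torch.sign(w) * torch.pow(2.0, p)
        w_ste = w + (w_q - w).detach()
        return F.linear(x, w_ste)

class AdderLinear(nn.Module):
    """AdderNet-style layer: each output is -sum_i |x_i - w_i| (no multiplies)."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.1)

    def forward(self, x):
        # (batch, 1, in) - (1, out, in) -> L1 distance per output unit
        diff = x.unsqueeze(1) - self.weight.unsqueeze(0)
        return -diff.abs().sum(dim=-1)

x = torch.randn(4, 8)
print(ShiftLinear(8, 3)(x).shape, AdderLinear(8, 3)(x).shape)
```

Operators like these are cheap in silicon but less expressive than true multiplication, which is exactly the accuracy gap the paper's training-time augmentation aims to close.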
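And here is an equally simplified sketch of the hybrid-augmentation idea as the summaries describe it: the deployable multiplication-free network is a slice of a wider multiplicative model, the slice reuses the large model's weight tensor, and both outputs receive the task loss during training. The widths, the single shared layer, and the plain summed loss are assumptions for illustration; the paper's actual weight-sharing rule for heterogeneous operators and its two-stage architecture search are not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HybridAugNet(nn.Module):
    def __init__(self, in_f, small_h, large_h, classes):
        super().__init__()
        assert small_h <= large_h
        self.small_h = small_h
        self.shared = nn.Parameter(torch.randn(large_h, in_f) * 0.1)  # one weight tensor
        self.head_small = nn.Linear(small_h, classes)
        self.head_large = nn.Linear(large_h, classes)

    @staticmethod
    def shift_quant(w):
        # Same power-of-two rounding with a straight-through estimator as above.
        p = torch.round(torch.log2(w.abs().clamp(min=1e-8)))
        wq = torch.sign(w) * torch.pow(2.0, p)
        return w + (wq - w).detach()

    def forward(self, x):
        # Tiny target model: first `small_h` units, multiplication-free path.
        w_small = self.shift_quant(self.shared[: self.small_h])
        y_small = self.head_small(F.relu(F.linear(x, w_small)))
        # Augmented model: all units, ordinary multiplicative path.
        y_large = self.head_large(F.relu(F.linear(x, self.shared)))
        return y_small, y_large

# Both outputs get the task loss during training; only the small path is deployed.
net = HybridAugNet(64, 16, 32, 10)
x, target = torch.randn(8, 64), torch.randint(0, 10, (8,))
y_s, y_l = net(x)
loss = F.cross_entropy(y_s, target) + F.cross_entropy(y_l, target)
loss.backward()
```

At deployment only the small multiplication-free path is kept, which is why the accuracy gained from the larger multiplicative model comes at no inference cost.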
Keywords
» Artificial intelligence » Image classification » Inference » Neural network » Semantic segmentation