Summary of QoS-Nets: Adaptive Approximate Neural Network Inference, by Elias Trommer et al.
QoS-Nets: Adaptive Approximate Neural Network Inference
by Elias Trommer, Bernd Waschneck, Akash Kumar
First submitted to arXiv on: 10 Oct 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: None
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract here |
Medium | GrooveSquid.com (original content) | QoS-Nets reuses approximate multipliers flexibly across neural network layers, so the arithmetic resource consumption can be varied at runtime and adapted gradually to changing operating conditions. A search algorithm selects a small subset of approximate multipliers from a larger search space, and the network is retrained to maximize task performance with that subset. Unlike previous work, QoS-Nets outputs multiple static assignments of approximate multiplier instances to layers, yielding several operating points that trade off accuracy against resource consumption. Evaluated on MobileNetV2, the approach saves between 15.3% and 42.8% of the power spent on multiplications at a Top-5 accuracy loss between 0.3 and 2.33 percentage points. (An illustrative code sketch of the operating-point idea follows this table.) |
Low | GrooveSquid.com (original content) | This paper helps computers use less energy by changing how they do math. It uses special “approximate multipliers” that are good enough for some tasks but not others. The system can pick the right multiplier for the job and even switch multipliers during use to save energy or improve performance. This lets devices adapt to changing conditions, for example when the battery is running low. |
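The sketch below is not from the paper; it is a minimal Python illustration of the operating-point idea described in the medium summary, where each operating point is a static assignment of approximate multiplier instances to layers and the deployment switches between them at runtime. All names (`OperatingPoint`, `QoSController`, the layer and multiplier identifiers) and the example numbers are assumptions made for illustration, loosely echoing the ranges reported in the abstract.

```python
# Hypothetical sketch, not the authors' code: QoS-Nets (as summarized above) produces
# several static layer-to-multiplier assignments ("operating points"); at runtime a
# controller can pick the most power-efficient one that still meets an accuracy budget.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class OperatingPoint:
    """One static layer-to-multiplier assignment with its measured trade-off."""
    name: str
    assignment: Dict[str, str]           # layer name -> approximate multiplier instance
    multiplication_power_savings: float  # fraction saved vs. exact multipliers
    top5_accuracy_drop: float            # percentage points vs. the exact baseline


class QoSController:
    """Selects the most power-efficient operating point within an accuracy budget."""

    def __init__(self, operating_points: List[OperatingPoint]):
        self.operating_points = operating_points

    def select(self, max_accuracy_drop: float) -> OperatingPoint:
        feasible = [op for op in self.operating_points
                    if op.top5_accuracy_drop <= max_accuracy_drop]
        if not feasible:
            raise ValueError("No operating point satisfies the accuracy budget")
        return max(feasible, key=lambda op: op.multiplication_power_savings)


# Illustrative numbers only, loosely echoing the ranges reported in the abstract.
points = [
    OperatingPoint("conservative", {"conv1": "mul_a", "conv2": "mul_a"}, 0.153, 0.3),
    OperatingPoint("aggressive",   {"conv1": "mul_b", "conv2": "mul_c"}, 0.428, 2.33),
]

controller = QoSController(points)
print(controller.select(max_accuracy_drop=1.0).name)  # -> "conservative"
```

In this toy setup, tightening or relaxing the accuracy budget at runtime selects a different pre-trained assignment, which is how the summaries describe the system adapting its arithmetic resource use to changing conditions.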
Keywords
» Artificial intelligence » Neural network