Achieving Pareto Optimality using Efficient Parameter Reduction for DNNs in Resource-Constrained Edge Environment

by Atah Nuh Mih, Alireza Rahimi, Asfia Kawnine, Francis Palma, Monica Wachowicz, Rickey Dubay, Hung Cao

First submitted to arXiv on: 14 Mar 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI); Computer Vision and Pattern Recognition (cs.CV)

GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same paper at a different level of difficulty. The medium-difficulty and low-difficulty versions are original summaries written by GrooveSquid.com, while the high-difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)

The paper’s original abstract, available on arXiv.

Medium Difficulty Summary (GrooveSquid.com, original content)

This paper proposes an optimization technique for Xception, a Deep Neural Network (DNN), to improve its hardware utilization and enable on-device training in resource-constrained edge environments. The method applies efficient parameter reduction strategies (one possible form is sketched after these summaries) that shrink the model and lower memory utilization during training without sacrificing accuracy. The authors evaluate the optimized model in two experiments, Caltech-101 image classification and PCB defect detection, comparing it against the original Xception, EfficientNetV2B1, and MobileNetV2. The optimized Xception achieves higher test accuracy and faster training and inference times than the original Xception while using less memory on average. The lightweight baselines, in contrast, overfit, underscoring the benefits of the proposed optimization technique.

Low Difficulty Summary (GrooveSquid.com, original content)

This paper is about making computers learn more efficiently without using too many resources. The researchers improved a special kind of computer program called a Deep Neural Network (DNN) so that it can run on devices like smartphones or smart home appliances. They made the program smaller and faster by removing parts that weren’t essential, so it uses less memory and energy. The team tested the new version on two tasks: recognizing objects in pictures and detecting defects in electronics. Their optimized DNN performed better than the original one while using even fewer resources.

Keywords

  • Artificial intelligence
  • Image classification
  • Inference
  • Neural network
  • Optimization