Summary of POMONAG: Pareto-Optimal Many-Objective Neural Architecture Generator, by Eugenio Lomurno et al.


POMONAG: Pareto-Optimal Many-Objective Neural Architecture Generator

by Eugenio Lomurno, Samuele Mariani, Matteo Monti, Matteo Matteucci

First submitted to arXiv on: 30 Sep 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI); Computer Vision and Pattern Recognition (cs.CV)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper introduces an approach to Neural Architecture Search (NAS) that addresses the limitations of existing methods by optimizing multiple objectives simultaneously. Unlike the previous state-of-the-art method, DiffusionNAG, which optimizes for accuracy alone, the Pareto-Optimal Many-Objective Neural Architecture Generator (POMONAG) also accounts for model complexity, computational efficiency, and inference latency, factors that are crucial when deploying models in resource-constrained environments. To achieve this, POMONAG uses a many-objective diffusion process that integrates Performance Predictor models to estimate these metrics and guide the diffusion gradients. The approach is evaluated on two search spaces (NASBench201 and MobileNetV3) and 15 image classification datasets, demonstrating improved performance and efficiency over the previous state-of-the-art.
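
The many-objective guidance described above can be pictured as a standard reverse-diffusion loop in which several surrogate predictors score the current architecture estimate and the gradient of their weighted sum nudges each denoising step. The sketch below is illustrative only and is not the authors' code: the Denoiser and Predictor classes, the guidance weights, and the encoding size are hypothetical stand-ins for POMONAG's diffusion model and Performance Predictors.

```python
# Illustrative sketch only (not the authors' implementation): a DDPM-style
# reverse-diffusion loop in which several hypothetical Performance Predictors
# steer sampling toward architectures with better many-objective trade-offs.
import torch
import torch.nn as nn

ENC_DIM = 64    # assumed size of the architecture encoding
STEPS = 50      # kept small so the sketch runs quickly

class Predictor(nn.Module):
    """Stand-in for a trained predictor of one metric (accuracy, latency, ...)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(ENC_DIM, 128), nn.ReLU(), nn.Linear(128, 1))
    def forward(self, x):
        return self.net(x)

class Denoiser(nn.Module):
    """Stand-in for the diffusion model that predicts the noise to remove."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(ENC_DIM + 1, 256), nn.ReLU(), nn.Linear(256, ENC_DIM))
    def forward(self, x, t):
        t_feat = torch.full((x.shape[0], 1), t / STEPS)
        return self.net(torch.cat([x, t_feat], dim=1))

def guided_sample(denoiser, predictors, weights, n=4, guidance_scale=1.0):
    """Reverse diffusion with weighted many-objective predictor guidance."""
    x = torch.randn(n, ENC_DIM)                       # start from pure noise
    betas = torch.linspace(1e-4, 0.02, STEPS)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)

    for t in reversed(range(STEPS)):
        with torch.no_grad():
            eps = denoiser(x, t)                      # predicted noise
            x0_hat = (x - torch.sqrt(1 - alpha_bars[t]) * eps) / torch.sqrt(alpha_bars[t])
            mean = (x - betas[t] / torch.sqrt(1 - alpha_bars[t]) * eps) / torch.sqrt(alphas[t])

        # Many-objective guidance: each predictor scores the current estimate
        # and the gradient of their weighted sum steers the denoising step.
        x_guided = x0_hat.clone().requires_grad_(True)
        score = sum(w * p(x_guided).sum() for w, p in zip(weights, predictors))
        grad = torch.autograd.grad(score, x_guided)[0]

        noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
        x = mean + guidance_scale * grad + torch.sqrt(betas[t]) * noise
    return x

if __name__ == "__main__":
    predictors = [Predictor() for _ in range(3)]      # e.g. accuracy, -latency, -complexity
    archs = guided_sample(Denoiser(), predictors, weights=[1.0, 0.3, 0.3])
    print(archs.shape)                                # (4, ENC_DIM) candidate encodings
```

In the paper the guidance comes from predictors trained for each metric on the target search space; the untrained networks here only show the control flow of predictor-guided sampling.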

Low Difficulty Summary (written by GrooveSquid.com, original content)
POMONAG is a new way of designing neural networks that doesn’t just focus on making them good at one thing. It considers many different things at once, like how simple or complicated the network is, how much computer power it needs, and how fast it can make predictions. This makes it better for using in real-world situations where resources are limited. The approach uses a special kind of process called diffusion to find the best balance between these different objectives.
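
For readers curious what "the best balance" means formally, POMONAG looks for Pareto-optimal architectures: candidates that no other candidate beats on every objective at once. The snippet below is a generic illustration of that idea with made-up numbers, not code or data from the paper.

```python
# Generic illustration of Pareto optimality with invented scores (not paper data).
def dominates(a, b):
    """True if candidate a is at least as good as b on every objective
    and strictly better on at least one (all objectives are maximized)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only candidates that no other candidate dominates."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]

# Hypothetical (accuracy %, -latency ms, -millions of parameters) for four networks.
scores = [(93.1, -12.0, -4.2), (91.5, -6.0, -2.1), (90.0, -14.0, -5.0), (92.8, -7.5, -2.3)]
print(pareto_front(scores))   # the (90.0, -14.0, -5.0) network is dominated and dropped
```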

Keywords

» Artificial intelligence  » Diffusion  » Image classification  » Inference