
Summary of "Multi-Objective Neural Architecture Search by Learning Search Space Partitions" by Yiyang Zhao et al.


Multi-Objective Neural Architecture Search by Learning Search Space Partitions

by Yiyang Zhao, Linnan Wang, Tian Guo

First submitted to arXiv on: 1 Jun 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract on arXiv.

Medium Difficulty Summary (original content by GrooveSquid.com)
This paper introduces LaMOO, a novel multi-objective optimizer for neural architecture search (NAS) in deep learning. The authors highlight the importance of considering metrics such as model size, inference latency, and computational cost (#FLOPs) in addition to traditional accuracy measures. To cope with the huge search space and the non-negligible search cost of NAS tasks, LaMOO employs a meta-algorithm that learns to partition the search space and focus sampling on the most promising regions (a generic sketch of this partition-then-focus idea appears after the summaries below). This yields significant improvements in sample efficiency over existing methods such as Bayesian optimization and evolutionary multi-objective optimizers. The authors demonstrate the effectiveness of LaMOO on several NAS datasets, including NasBench201 and CIFAR10, achieving improved accuracy at reduced computational cost.

Low Difficulty Summary (original content by GrooveSquid.com)
This paper helps us build better deep learning models by finding the right balance between different goals. Traditionally, we only looked at how well a model performs, but now we also need to think about things like how big it is and how fast it runs. To do this, researchers use something called multi-objective optimization. It's like trying to find the best recipe for cookies: you want them to be delicious (high accuracy), easy to make (low computational cost), and made with only a few ingredients (small model size). The authors introduce a new method, called LaMOO, that is very good at finding strong solutions quickly. They test it on some well-known datasets and show that it works much better than other methods.
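
The medium difficulty summary describes LaMOO's core idea at a high level: learn to partition the search space, then concentrate the sampling budget in the most promising region. The toy Python sketch below illustrates only that general partition-then-focus loop; the two-knob search space, the evaluate() function, and the axis-aligned splitting rule are hypothetical stand-ins, not the paper's actual algorithm.

```python
# Illustrative sketch only (not the authors' implementation): a toy
# multi-objective search that learns a partition of the search space and
# concentrates new samples in the most promising region. The two "knobs",
# the evaluate() function, and the splitting rule are hypothetical
# stand-ins for real NAS components.

import random

random.seed(0)


def evaluate(arch):
    """Toy objectives to minimize: (error, cost). Stands in for training
    an architecture and measuring accuracy, latency, or #FLOPs."""
    depth, width = arch
    error = (1.0 - depth) ** 2 + 0.3 * (width - 0.6) ** 2 + random.gauss(0, 0.01)
    cost = 0.7 * depth + 0.5 * width  # larger nets cost more
    return (error, cost)


def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def pareto_indices(points):
    """Indices of non-dominated (Pareto-optimal) samples."""
    return [i for i, (_, fi) in enumerate(points)
            if not any(dominates(fj, fi)
                       for j, (_, fj) in enumerate(points) if j != i)]


def learn_partition(points):
    """Pick the axis-aligned split of the architecture space whose region
    contains the highest fraction of Pareto-optimal samples. A crude
    stand-in for learning search-space partitions."""
    good = set(pareto_indices(points))
    best = None  # (purity, axis, threshold, keep_lower_side)
    for axis in (0, 1):
        for arch, _ in points:
            thr = arch[axis]
            for keep_lower in (True, False):
                region = [i for i, (a, _) in enumerate(points)
                          if (a[axis] <= thr) == keep_lower]
                if len(region) < 3:  # ignore degenerate, tiny regions
                    continue
                purity = sum(i in good for i in region) / len(region)
                if best is None or purity > best[0]:
                    best = (purity, axis, thr, keep_lower)
    return best[1], best[2], best[3]


def sample_in_region(axis, thr, keep_lower):
    """Rejection-sample a new architecture inside the promising region."""
    while True:
        arch = (random.random(), random.random())
        if (arch[axis] <= thr) == keep_lower:
            return arch


# Initial uniform exploration of the whole search space.
points = []
for _ in range(20):
    arch = (random.random(), random.random())
    points.append((arch, evaluate(arch)))

# A few rounds of partitioning followed by focused sampling.
for _ in range(5):
    axis, thr, keep_lower = learn_partition(points)
    for _ in range(10):
        arch = sample_in_region(axis, thr, keep_lower)
        points.append((arch, evaluate(arch)))

print("Pareto-optimal (error, cost) pairs found:")
for error, cost in sorted(points[i][1] for i in pareto_indices(points)):
    print(f"  error={error:.3f}  cost={cost:.3f}")
```

Running the sketch prints the non-dominated (error, cost) pairs it found; in a real NAS setting the same loop would be driven by trained-model accuracy, latency, model size, or #FLOPs rather than these toy formulas.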

Keywords

» Artificial intelligence  » Deep learning  » Inference  » Optimization