

Sampling-based Pareto Optimization for Chance-constrained Monotone Submodular Problems

by Xiankun Yan, Aneta Neumann, Frank Neumann

First submitted to arXiv on: 18 Apr 2024

Categories

  • Main: Artificial Intelligence (cs.AI)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty summary is the paper’s original abstract, available on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper explores the application of chance-constrained optimization in evolutionary computation, focusing on Pareto optimization algorithms. Recently developed surrogate functions based on tail inequalities have been used successfully to optimize chance-constrained monotone submodular problems. However, there remains a gap in understanding the performance difference between algorithms that use these surrogates and those that evaluate the chance constraint directly by sampling. To address this, the authors propose a sampling-based method for directly evaluating chance constraints, together with an enhanced GSEMO algorithm that integrates an adaptive sliding window (ASW-GSEMO). ASW-GSEMO is tested on the maximum coverage problem under different settings, and its results are compared to those of algorithms using different surrogate functions. The results show that ASW-GSEMO outperforms these algorithms, and that its performance is comparable across the different evaluation methods. Furthermore, the paper visualizes the behavior of ASW-GSEMO to highlight its advantages over the surrogate-based approaches.
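The contrast between surrogate-based and sampling-based evaluation of a chance constraint can be illustrated with a small sketch. The Python snippet below is not the authors’ algorithm; it only compares (i) a Monte Carlo estimate of the probability that a solution’s stochastic weight exceeds a budget with (ii) a feasibility check built from a one-sided Chebyshev (Cantelli) tail bound, one of the tail inequalities typically used to construct such surrogates. All instance data (independent, normally distributed item weights, the budget, and the tolerance ALPHA) are hypothetical.

```python
import random

# Hypothetical instance: each item has a stochastic weight with known mean and
# variance (independent items). The chance constraint requires
# Pr[ total weight of the chosen items > BUDGET ] <= ALPHA.
MEANS = [3.0, 2.5, 4.0, 1.5, 2.0]
VARIANCES = [0.5, 0.8, 0.3, 0.6, 0.4]
BUDGET = 8.0
ALPHA = 0.1  # allowed probability of exceeding the budget


def sampled_violation_probability(solution, num_samples=10_000, seed=0):
    """Monte Carlo estimate of Pr[weight(solution) > BUDGET].

    Each sample draws an independent weight for every selected item
    (normally distributed weights are an assumption made for illustration).
    """
    rng = random.Random(seed)
    violations = 0
    for _ in range(num_samples):
        total = sum(
            rng.gauss(MEANS[i], VARIANCES[i] ** 0.5)
            for i, chosen in enumerate(solution) if chosen
        )
        if total > BUDGET:
            violations += 1
    return violations / num_samples


def chebyshev_surrogate_feasible(solution):
    """Surrogate feasibility check via the one-sided Chebyshev (Cantelli) bound:
    Pr[W > mu + t] <= var / (var + t^2) for t > 0. The solution is accepted
    only if the bound already certifies a violation probability <= ALPHA."""
    mu = sum(m for m, chosen in zip(MEANS, solution) if chosen)
    var = sum(v for v, chosen in zip(VARIANCES, solution) if chosen)
    t = BUDGET - mu
    if t <= 0:
        return False  # expected weight already exceeds the budget
    return var / (var + t * t) <= ALPHA


if __name__ == "__main__":
    candidate = [0, 1, 0, 1, 1]  # an example subset of items
    p_hat = sampled_violation_probability(candidate)
    print(f"sampled violation probability:   {p_hat:.3f}")
    print(f"feasible by sampling:            {p_hat <= ALPHA}")
    print(f"feasible by Chebyshev surrogate: {chebyshev_surrogate_feasible(candidate)}")
```

The surrogate is conservative: it may reject solutions that direct sampling would accept, which illustrates the kind of performance gap between surrogate-based and sampling-based evaluation that the paper investigates.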
Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper is about solving hard selection problems with computers and algorithms. The goal is to pick the best combination of items while making sure that an uncertain limit, like a budget, is only exceeded very rarely. Other researchers have tackled this using formulas that estimate the risk, but it wasn’t clear how well those formulas compare to simply simulating the randomness many times. So the authors came up with a new way of checking the risk by sampling, built it into an improved algorithm, and tested it on some big problems. They found that their method worked really well and was better than the earlier approaches. They also drew pictures of how the new method behaves to help explain what’s going on.

Keywords

» Artificial intelligence  » Optimization