Summary of A Unified Framework For Entropy Search and Expected Improvement in Bayesian Optimization, by Nuojin Cheng et al.
A Unified Framework for Entropy Search and Expected Improvement in Bayesian Optimization
by Nuojin Cheng, Leonard Papenmeier, Stephen Becker, Luigi Nardi
First submitted to arXiv on: 30 Jan 2025
Categories
- Main: Machine Learning (stat.ML)
- Secondary: Machine Learning (cs.LG); Optimization and Control (math.OC)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here |
| Medium | GrooveSquid.com (original content) | Bayesian optimization of expensive black-box functions has advanced significantly, with Expected Improvement (EI) remaining one of the most widely used acquisition functions. Information-theoretic acquisition functions, which aim to reduce uncertainty about the function’s optimum, are often considered distinct from EI. This paper challenges that view by introducing Variational Entropy Search (VES), a framework showing that EI and information-theoretic acquisition functions are more closely related than previously recognized: EI can be interpreted as a variational-inference approximation of Max-value Entropy Search (MES). Building on this insight, the authors propose VES-Gamma, an acquisition function that balances the strengths of EI and MES. Extensive evaluations on synthetic and real-world benchmarks show that VES-Gamma is competitive with state-of-the-art methods and outperforms both EI and MES in many cases. |
| Low | GrooveSquid.com (original content) | This paper is about a new way to find the best settings for complicated computer programs without having to test every option. The approach, called Bayesian optimization, helps us make better choices when we don’t know in advance what will work best. The researchers found that two seemingly different methods, Expected Improvement and Max-value Entropy Search, are more closely related than people thought. They used this insight to create a new method, VES-Gamma, which works well on many different problems. |
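To make the summary concrete: the Expected Improvement acquisition function mentioned above has a standard closed form when the surrogate model’s posterior at a candidate point is Gaussian. The sketch below shows that textbook EI formula only (for maximization); it is not the paper’s VES or VES-Gamma method, whose details are not given in this summary. The function name and arguments are illustrative.

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Closed-form Expected Improvement for maximization, assuming the
    surrogate posterior at the candidate point is N(mu, sigma^2) and
    f_best is the best objective value observed so far."""
    if sigma <= 0.0:
        # Degenerate posterior: improvement is deterministic.
        return max(mu - f_best, 0.0)
    z = (mu - f_best) / sigma
    # Standard normal CDF and PDF evaluated at z.
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (mu - f_best) * Phi + sigma * phi
```

In a Bayesian optimization loop, this score is computed from the Gaussian-process posterior at each candidate point, and the next evaluation is placed where EI is largest; the paper’s contribution is to reinterpret this same quantity through the lens of variational inference on MES.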
Keywords
» Artificial intelligence » Inference » Optimization