
Summary of Minimizing UCB: a Better Local Search Strategy in Local Bayesian Optimization, by Zheyi Fan et al.


Minimizing UCB: a Better Local Search Strategy in Local Bayesian Optimization

by Zheyi Fan, Wenyu Wang, Szu Hui Ng, Qingpei Hu

First submitted to arXiv on: 24 May 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Optimization and Control (math.OC)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (written by GrooveSquid.com, original content)
A novel approach to high-dimensional black-box optimization is presented, focusing on the class of approximated-gradient methods, which implement a strategy similar to gradient descent. These methods have shown good experimental results and carry theoretical guarantees. However, by exploiting the distributional properties of the Gaussian processes these methods employ, the Bayesian optimization search can be improved further. The paper develops the relationship between a gradient descent step and minimizing the Upper Confidence Bound (UCB), showing that the latter can be the better strategy when a Gaussian process serves as the surrogate. A new local Bayesian optimization algorithm, MinUCB, is obtained by replacing the gradient descent step in GIBO with a UCB-minimization step, while maintaining a similar convergence rate. An improved acquisition function based on a look-ahead strategy then yields LA-MinUCB. Both algorithms are applied to a range of synthetic and real-world functions, demonstrating their effectiveness.
Low Difficulty Summary (written by GrooveSquid.com, original content)
Local Bayesian optimization uses approximated-gradient methods that work like gradient descent. These methods have been tested and show promise. The paper looks at how the Gaussian processes behind them can improve the search further, and finds that a different move, minimizing the Upper Confidence Bound (UCB), can be better. A new algorithm, MinUCB, is built on this idea; a look-ahead refinement then yields LA-MinUCB. Experiments show the method is effective on synthetic and real-world problems.
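
To make the core idea concrete, here is a minimal Python sketch of one MinUCB-style local step, not the authors' implementation: fit a Gaussian process to points sampled near the incumbent, then move to the minimizer of the upper confidence bound μ(x) + β·σ(x) inside a local region. The helper name `min_ucb_step`, the RBF kernel, the value of β, and the region size are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF


def min_ucb_step(f, x_center, radius=0.5, beta=2.0, n_local=20, seed=0):
    """One illustrative local step: sample around the incumbent, fit a GP,
    and return the minimizer of the UCB mu(x) + beta * sigma(x)."""
    rng = np.random.default_rng(seed)
    d = x_center.shape[0]
    # Evaluate the black-box function at points near the incumbent.
    X = x_center + radius * rng.uniform(-1.0, 1.0, size=(n_local, d))
    y = np.array([f(x) for x in X])
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
    gp.fit(X, y)

    def ucb(x):
        # Pessimistic (high-probability upper) estimate of f at x.
        mu, sigma = gp.predict(x.reshape(1, -1), return_std=True)
        return float(mu[0] + beta * sigma[0])

    # Minimize the UCB within a box around the incumbent.
    bounds = [(c - radius, c + radius) for c in x_center]
    res = minimize(ucb, x0=x_center, bounds=bounds, method="L-BFGS-B")
    return res.x


# Toy usage: one step on a quadratic moves toward the origin.
f = lambda x: float(np.sum(x ** 2))
x_new = min_ucb_step(f, np.array([1.0, -1.5]))
print(x_new, f(x_new))
```

The intuition for why this is a safe move: under the GP model the UCB upper-bounds the objective with high probability, so a point whose UCB falls below the incumbent's observed value is likely a genuine improvement.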

Keywords

  • Artificial intelligence
  • Gradient descent
  • Optimization