
Summary of Batch and Match: Black-box Variational Inference with a Score-based Divergence, by Diana Cai et al.


Batch and match: black-box variational inference with a score-based divergence

by Diana Cai, Chirag Modi, Loucas Pillaud-Vivien, Charles C. Margossian, Robert M. Gower, David M. Blei, Lawrence K. Saul

First submitted to arXiv on: 22 Feb 2024

Categories

  • Main: Machine Learning (stat.ML)
  • Secondary: Artificial Intelligence (cs.AI); Machine Learning (cs.LG); Computation (stat.CO)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (written by GrooveSquid.com, original content)
The paper proposes a new approach to black-box variational inference (BBVI), called batch and match (BaM). Traditional BBVI methods optimize a stochastic evidence lower bound (ELBO), but they often converge slowly because of high-variance gradient estimates and sensitivity to hyperparameters. BaM instead minimizes a score-based divergence that admits a closed-form proximal update for Gaussian variational families with full covariance matrices. The paper analyzes the convergence of BaM when the target distribution is Gaussian and proves that it converges exponentially fast in the limit of infinite batch size. It also evaluates BaM on a variety of target distributions, including those arising from hierarchical and deep generative models. The results show that BaM typically converges faster than traditional BBVI methods.

Low Difficulty Summary (written by GrooveSquid.com, original content)
The paper is about making a kind of machine learning called black-box variational inference (BBVI) work better. Right now, BBVI can be slow because it is hard to extract the right information from the data. The researchers created a new method called batch and match (BaM), which uses a different way of measuring how close it is to the right answer. They tested BaM on several examples and found that it usually works faster than other methods.
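The medium-difficulty summary above describes fitting a full-covariance Gaussian by matching scores rather than optimizing the ELBO. The paper's actual closed-form proximal update is not reproduced here; the following NumPy sketch only illustrates the general "batch and match" flavor under strong simplifying assumptions: the target is itself a Gaussian whose score function is available, and each iteration draws a batch from the current variational Gaussian and matches the target's scores by an affine least-squares regression. All variable names and the regression-based update are illustrative choices, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3

# Hypothetical Gaussian target p(x) = N(mu_p, Sigma_p). In the black-box
# setting only the score function grad_x log p(x) would be available.
mu_p = np.array([1.0, -2.0, 0.5])
M = rng.standard_normal((d, d))
Sigma_p = M @ M.T + d * np.eye(d)
Lambda_p = np.linalg.inv(Sigma_p)

def target_score(x):
    # grad_x log p(x) for the Gaussian target (rows of x are points)
    return -(x - mu_p) @ Lambda_p

# Current variational Gaussian q, initialized at a standard normal.
mu_q, Sigma_q = np.zeros(d), np.eye(d)

for _ in range(5):
    # "Batch": draw samples from the current q.
    X = rng.multivariate_normal(mu_q, Sigma_q, size=2000)
    S = target_score(X)
    # "Match": fit an affine model s ~ c - A x by least squares,
    # i.e. regress the target's scores on the sampled points.
    X1 = np.hstack([X, np.ones((len(X), 1))])
    W, *_ = np.linalg.lstsq(X1, S, rcond=None)
    A = -W[:d]            # estimated precision matrix of the target
    c = W[d]              # estimated Lambda_p @ mu_p
    A = 0.5 * (A + A.T)   # symmetrize for numerical stability
    Sigma_q = np.linalg.inv(A)
    mu_q = Sigma_q @ c
```

Because the target here is exactly Gaussian, its score is exactly affine in x, so the regression recovers the target's mean and covariance up to numerical precision; for non-Gaussian targets such a fit would only give the best Gaussian score approximation under the current batch.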

Keywords

* Artificial intelligence
* Inference
* Machine learning