Summary of AdvNF: Reducing Mode Collapse in Conditional Normalising Flows Using Adversarial Learning, by Vikas Kanaujia, Mathias S. Scheurer, and Vipul Arora
AdvNF: Reducing Mode Collapse in Conditional Normalising Flows using Adversarial Learning
by Vikas Kanaujia, Mathias S. Scheurer, and Vipul Arora
First submitted to arXiv on: 29 Jan 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Statistical Mechanics (cond-mat.stat-mech); Computational Physics (physics.comp-ph)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary
---|---|---
High | Paper authors | Read the original abstract here
Medium | GrooveSquid.com (original content) | Deep generative models have emerged as a powerful tool for efficiently sampling from high-dimensional distributions, complementing traditional Markov chain Monte Carlo methods. In particular, explicit generators like Normalising Flows (NFs), combined with the Metropolis-Hastings algorithm, have shown great promise in producing unbiased samples. However, conditional NFs still face significant challenges, including high variance, mode collapse, and data inefficiency. To address these issues, we propose adversarial training for NFs. Our experiments demonstrate the effectiveness of this method on low-dimensional synthetic datasets and on XY spin models in two spatial dimensions. (Hedged code sketches of the sampling scheme and the adversarial idea follow the table.)
Low | GrooveSquid.com (original content) | Imagine you have a machine that can generate new images or music by learning from existing ones. But what if this machine gets stuck in a rut, generating only variations of the same old song? That’s essentially what can happen when we use a type of computer model called a Normalising Flow (NF) to draw samples from a target distribution: the flow “collapses” onto just a few modes. Our goal is to make NFs better at creating diverse and realistic samples by introducing an “adversary” that learns to spot the NF’s repetitive output, pushing the NF to generate more variety. We test our approach on simple datasets and spin models, showing it can help overcome common problems like high variance and mode collapse.
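As a rough illustration of the sampling scheme the medium-difficulty summary describes, below is a minimal sketch of an independence Metropolis-Hastings sampler whose proposals come from a flow. All names here are hypothetical stand-ins: `flow_sample` and `flow_log_prob` mimic a trained normalising flow with a fixed Gaussian, and `target_log_prob` is a toy bimodal density standing in for, e.g., an XY-model Boltzmann weight; none of this is the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def target_log_prob(x):
    # Toy unnormalised bimodal target (two Gaussian bumps at +/-2);
    # a stand-in for e.g. an XY-model Boltzmann density.
    return np.logaddexp(-0.5 * np.sum((x - 2.0) ** 2),
                        -0.5 * np.sum((x + 2.0) ** 2))

def flow_sample():
    # Stand-in for drawing x ~ q(x) from a trained normalising flow.
    return rng.normal(0.0, 3.0, size=2)

def flow_log_prob(x):
    # Stand-in for the flow's exact log-density log q(x)
    # (here an isotropic Gaussian with standard deviation 3).
    return (-0.5 * np.sum((x / 3.0) ** 2)
            - x.size * np.log(3.0 * np.sqrt(2.0 * np.pi)))

def nf_metropolis_hastings(n_steps=10_000):
    """Independence Metropolis-Hastings with flow proposals.

    Accepting with probability min(1, p(x')q(x) / (p(x)q(x'))) removes
    the bias of an imperfect flow, at the cost of correlated samples.
    """
    x = flow_sample()
    chain, accepts = [x], 0
    for _ in range(n_steps):
        x_prop = flow_sample()
        log_a = (target_log_prob(x_prop) - target_log_prob(x)
                 + flow_log_prob(x) - flow_log_prob(x_prop))
        if np.log(rng.uniform()) < log_a:
            x, accepts = x_prop, accepts + 1
        chain.append(x)
    return np.array(chain), accepts / n_steps

samples, acc_rate = nf_metropolis_hastings()
print(f"acceptance rate: {acc_rate:.2f}")  # low rate = poor flow/target overlap
```

If the flow has collapsed onto one mode, the chain rarely proposes (and so rarely visits) the missing modes, which is exactly the failure the paper targets.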
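The adversarial ingredient can be sketched in the same hedged spirit: a discriminator is trained to separate target samples from flow samples, and the flow gets an extra GAN-style loss for being caught. The two-layer `disc` network and the loss pairing below are illustrative assumptions about what "adversarial training for NFs" could look like, not the paper's architecture or objective.

```python
import torch
import torch.nn as nn

# Illustrative discriminator for 2-D samples; the paper's architecture
# is not specified here, so this network is purely an assumption.
disc = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
bce = nn.BCEWithLogitsLoss()

def adversarial_losses(x_flow, x_data):
    # Discriminator: push real target samples toward label 1, flow samples toward 0.
    d_loss = (bce(disc(x_data), torch.ones(len(x_data), 1))
              + bce(disc(x_flow.detach()), torch.zeros(len(x_flow), 1)))
    # Generator (the flow): make the discriminator label its samples as real.
    g_loss = bce(disc(x_flow), torch.ones(len(x_flow), 1))
    return d_loss, g_loss

# Usage with stand-in tensors; in practice x_flow would be differentiable
# flow output so that g_loss can update the flow's parameters.
x_data = torch.randn(128, 2) + 2.0
x_flow = torch.randn(128, 2, requires_grad=True)
d_loss, g_loss = adversarial_losses(x_flow, x_data)
```

In practice such an adversarial term would be combined with a standard flow objective (e.g. a KL-based likelihood loss); the summary does not specify the weighting between the two, so none is shown here.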