
Summary of Partial Distribution Matching via Partial Wasserstein Adversarial Networks, by Zi-Ming Wang et al.


Partial Distribution Matching via Partial Wasserstein Adversarial Networks

by Zi-Ming Wang, Nan Xue, Ling Lei, Rebecka Jörnsten, Gui-Song Xia

First submitted to arXiv on: 16 Sep 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Machine Learning (stat.ML)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (written by GrooveSquid.com; original content)
The paper studies distribution matching, a fundamental machine learning problem seeking to align two probability distributions. The authors propose partial distribution matching (PDM), which seeks to match a fraction of the distributions instead of matching them completely. They derive the Kantorovich-Rubinstein duality for the partial Wasserstein-1 discrepancy and develop a partial Wasserstein adversarial network (PWAN) that approximates this discrepancy. The authors demonstrate the effectiveness of their approach on two practical tasks: point set registration and partial domain adaptation. Their results show that the proposed PWAN produces highly robust matching results, comparable to state-of-the-art methods.
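The core PDM idea, transporting only a fraction of the mass so that outliers can be left unmatched, can be illustrated on small discrete point sets. The sketch below is our own toy analogue, not the paper's PWAN (which approximates the partial Wasserstein-1 discrepancy with a neural critic derived from the Kantorovich-Rubinstein duality); the `partial_match` helper and the zero-cost dummy-padding trick are assumptions for illustration, using SciPy's assignment solver:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def partial_match(X, Y, frac):
    """Toy partial matching: transport only a fraction `frac` of the
    points in X to points in Y at minimum total Euclidean cost.
    (Illustrative discrete analogue, not the paper's neural PWAN.)"""
    n, m = len(X), len(Y)
    k = int(round(frac * min(n, m)))     # number of pairs to match
    C = cdist(X, Y)                      # n x m real transport costs
    # Pad to a square (n+m-k) x (m+n-k) matrix: zero-cost dummy columns
    # let real rows go unmatched, zero-cost dummy rows do the same for
    # columns, so exactly k real pairs are formed (costs are positive).
    top = np.hstack([C, np.zeros((n, n - k))])
    M = np.vstack([top, np.zeros((m - k, m + n - k))])
    rows, cols = linear_sum_assignment(M)
    pairs = [(i, j) for i, j in zip(rows, cols) if i < n and j < m]
    cost = sum(C[i, j] for i, j in pairs)
    return pairs, cost

# Matching 2/3 of the mass leaves the outlier pair (10 vs 50) unmatched.
X = np.array([[0.0], [1.0], [10.0]])
Y = np.array([[0.1], [1.1], [50.0]])
pairs, cost = partial_match(X, Y, frac=2 / 3)
```

Because only a fraction of the mass must be moved, the expensive outlier correspondence is simply dropped, which is what makes PDM robust in tasks like point set registration with partial overlap.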
Low Difficulty Summary (written by GrooveSquid.com; original content)
This paper is about a big problem in machine learning called distribution matching. It’s like trying to match two puzzles together, but instead of pictures, it’s probability distributions. The authors have a new way to do this, which they call partial distribution matching (PDM). They show that their method works well on two real-world tasks: aligning 3D shapes and adapting computer vision systems for different datasets.

Keywords

» Artificial intelligence  » Domain adaptation  » Machine learning  » Probability