
Gaussian-Smoothed Sliced Probability Divergences

by Mokhtar Z. Alaya, Alain Rakotomamonjy, Maxime Berar, Gilles Gasso

First submitted to arXiv on: 4 Apr 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Statistics Theory (math.ST); Machine Learning (stat.ML)

GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same paper but is written at a different level of difficulty. The medium and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to read the version that suits you best!

High Difficulty Summary (written by the paper authors)

The high difficulty summary is the paper’s original abstract.

Medium Difficulty Summary (written by GrooveSquid.com, original content)

The paper introduces the Gaussian-smoothed sliced Wasserstein distance, a metric for comparing probability distributions while preserving data privacy. This metric has been shown to perform similarly to its non-private counterpart, but its computational and statistical properties had not yet been well established. The authors investigate the theoretical properties of this distance and of generalized versions, termed Gaussian-smoothed sliced divergences. They show that smoothing and slicing preserve both the metric property and the weak topology. They also introduce a double empirical distribution for the smoothed-projected origin distribution and prove that the Gaussian-smoothed sliced Wasserstein distance converges at rate O(n^(-1/2)). Further properties, including continuity with respect to the smoothing parameter, are derived as well. Empirical studies in the context of privacy-preserving domain adaptation support these findings.
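To make the construction concrete, here is a minimal NumPy sketch of a Gaussian-smoothed sliced Wasserstein distance between two sample sets: the samples are projected onto random directions (slicing), Gaussian noise with standard deviation sigma is added to the projections (smoothing, equivalent to convolving the projected laws with a Gaussian), and the resulting one-dimensional distributions are compared via the sorted-sample quantile coupling. The function name, defaults, and Monte Carlo averaging are illustrative assumptions, not the authors' implementation.

import numpy as np

def gaussian_smoothed_sliced_wasserstein(X, Y, sigma=1.0, n_projections=50, p=2, seed=0):
    # Monte Carlo sketch: average the p-Wasserstein distance between
    # Gaussian-smoothed 1D projections over random slicing directions.
    # X, Y: (n, d) sample arrays; equal sample sizes are assumed so the
    # sorted-sample 1D coupling below applies directly.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Draw random directions uniformly on the unit sphere (the slices).
    thetas = rng.normal(size=(n_projections, d))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)
    total = 0.0
    for theta in thetas:
        # Project, then smooth by adding N(0, sigma^2) noise to each
        # projected sample.
        x_proj = X @ theta + sigma * rng.normal(size=len(X))
        y_proj = Y @ theta + sigma * rng.normal(size=len(Y))
        # 1D Wasserstein-p distance via the quantile (sorted-sample) coupling.
        total += np.mean(np.abs(np.sort(x_proj) - np.sort(y_proj)) ** p)
    return (total / n_projections) ** (1.0 / p)

# Toy usage: two Gaussian clouds in five dimensions.
X = np.random.default_rng(1).normal(0.0, 1.0, size=(500, 5))
Y = np.random.default_rng(2).normal(0.5, 1.0, size=(500, 5))
print(gaussian_smoothed_sliced_wasserstein(X, Y, sigma=0.5))

In this reading, increasing sigma strengthens the smoothing (the noise that provides the privacy guarantee) at the cost of a less discriminative distance, which is why the paper's continuity results with respect to the smoothing parameter matter.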
Low Difficulty Summary (written by GrooveSquid.com, original content)

The paper is about a new way to compare probability distributions while keeping data private. The method, called the Gaussian-smoothed sliced Wasserstein distance, works similarly to its non-private version. The authors study its properties to understand how well it works. They show that the method still behaves well when noise is added to keep the data private, establish how fast it converges, and describe what happens when the smoothing parameter changes. Finally, they test their findings on a real-world problem: privacy-preserving domain adaptation.

Keywords

  • Artificial intelligence
  • Domain adaptation
  • Probability