


Sum of Squares Circuits

by Lorenzo Loconte, Stefan Mengel, Antonio Vergari

First submitted to arXiv on: 21 Aug 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI); Computational Complexity (cs.CC); Algebraic Geometry (math.AG)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)

Read the original abstract here.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
The paper presents a novel framework for designing expressive generative models that support exact and efficient inference, focusing on probabilistic circuits (PCs) with negative parameters. Building on recent advances in squared PCs, the study examines the trade-off between tractability and expressiveness. The authors prove that squared PCs can be less expressive than monotonic PCs, and they introduce a new class, sum of squares PCs, which can be exponentially more expressive than both squared and monotonic PCs. Around sum of squares PCs they construct an expressiveness hierarchy that unifies and separates several tractable model classes, including Born machines and PSD models. Experiments demonstrate the effectiveness of these circuits for distribution estimation.
Low Difficulty Summary (written by GrooveSquid.com, original content)
Expressive generative models are important in machine learning. This paper studies a special type of model called probabilistic circuits (PCs), which represent a probability distribution and support exact inference. PCs that use negative parameters and are then squared to stay non-negative, known as squared PCs, have been shown to be very expressive. However, this study finds that squared PCs can actually be less expressive than another type of PC called monotonic PCs, which use only non-negative parameters. The authors also introduce a new class of PCs called sum of squares PCs that can be even more expressive than both. They show how these circuits can be used for tasks like estimating probability distributions and demonstrate their effectiveness experimentally.
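The contrast between the three model classes discussed above can be sketched numerically. The following is a toy illustration with made-up component functions over a tiny discrete variable, not the paper's actual circuit construction: a monotonic mixture keeps everything non-negative by restriction, a squared PC squares one real-valued circuit, and a sum of squares PC adds several squared circuits.

```python
import numpy as np

# Toy component functions f_k(x) over a discrete variable x in {0, 1, 2}.
# Rows are components, columns are the values of x. These numbers are
# illustrative only.
f = np.array([[1.0, 0.5, 0.2],
              [0.3, -0.8, 1.0]])

# Monotonic PC: non-negative weights over non-negative components
# (here we force non-negativity of the components with abs).
w_mono = np.array([0.6, 0.4])
p_mono = w_mono @ np.abs(f)          # non-negative by construction

# Squared PC: one circuit with possibly negative parameters, then squared.
c = np.array([0.7, -0.3]) @ f        # real-valued output for each x
p_sq = c ** 2                        # non-negative after squaring

# Sum of squares PC: sum several squared circuits (subsumes both above).
C = np.array([[0.7, -0.3],
              [0.1, 0.9]]) @ f
p_sos = (C ** 2).sum(axis=0)

# Each unnormalized function yields a valid distribution after dividing
# by its total mass; for these tractable models the normalizer is a sum.
for p in (p_mono, p_sq, p_sos):
    dist = p / p.sum()
    assert np.all(dist >= 0) and np.isclose(dist.sum(), 1.0)
```

The point of the sketch is only the parameterization: squaring lets negative parameters appear while the final values stay non-negative, and summing several squares strictly generalizes a single square.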

Keywords

  • Artificial intelligence
  • Inference
  • Machine learning
  • Probability