
Convergence of Manifold Filter-Combine Networks

by David R. Johnson, Joyce Chew, Siddharth Viswanath, Edward De Brouwer, Deanna Needell, Smita Krishnaswamy, Michael Perlmutter

First submitted to arXiv on: 18 Oct 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Signal Processing (eess.SP); Machine Learning (stat.ML)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (original content by GrooveSquid.com)
In this paper, the authors introduce Manifold Filter-Combine Networks (MFCNs) as a framework for understanding manifold neural networks (MNNs). MFCNs use a filter-combine paradigm, analogous to the aggregate-combine paradigm for graph neural networks (GNNs), which gives rise to new families of MNNs. The authors propose implementing MFCNs on high-dimensional point clouds by approximating the underlying manifold with a sparse graph, and they show that this approach is consistent: it converges to a continuum limit as the number of data points grows.
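The graph-based approximation described above can be sketched in code. The snippet below is only an illustration of the general filter-combine pattern, not the authors' implementation: it samples points from a simple manifold (a circle), builds a k-nearest-neighbor graph Laplacian (dense here for clarity, rather than sparse), applies a bank of spectral heat-kernel filters to each feature channel, and combines the filtered channels with a random linear map standing in for learned weights. All function names and parameter choices are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_graph_laplacian(points, k=8):
    """Approximate the manifold by a symmetrized k-NN graph and
    return its unnormalized graph Laplacian L = D - W."""
    n = len(points)
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k + 1)  # first neighbor is the point itself
    W = np.zeros((n, n))
    for i in range(n):
        for j in idx[i, 1:]:
            W[i, j] = 1.0
            W[j, i] = 1.0  # symmetrize the adjacency
    return np.diag(W.sum(axis=1)) - W

def filter_combine_layer(L, X, n_filters=3, t=0.5, rng=None):
    """One illustrative filter-combine step: apply several spectral
    filters h_f(lambda) = exp(-f * t * lambda) to each feature channel,
    then combine the filtered channels with a linear map + nonlinearity.
    (A random Theta stands in for learned combine weights.)"""
    rng = np.random.default_rng(0) if rng is None else rng
    evals, evecs = np.linalg.eigh(L)          # spectral decomposition of L
    filtered = []
    for f in range(1, n_filters + 1):
        h = np.exp(-f * t * evals)            # heat-kernel filter response
        Hf = evecs @ np.diag(h) @ evecs.T     # filter operator in node space
        filtered.append(Hf @ X)
    Z = np.concatenate(filtered, axis=1)      # stack all filtered channels
    Theta = rng.standard_normal((Z.shape[1], X.shape[1]))
    return np.tanh(Z @ Theta)                 # combine step + nonlinearity

# Point cloud sampled from a 1-D manifold (the unit circle) in R^2.
theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
points = np.column_stack([np.cos(theta), np.sin(theta)])
L = knn_graph_laplacian(points, k=4)
out = filter_combine_layer(L, points)         # shape (50, 2)
```

As the number of sampled points grows, the graph Laplacian of a suitably constructed k-NN graph approximates the manifold's Laplace-Beltrami operator, which is the sense in which the discrete network converges to its continuum limit.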
Low Difficulty Summary (original content by GrooveSquid.com)
Manifold neural networks (MNNs) are a type of artificial intelligence designed to work with complex data. Researchers created a new kind of MNN called the Manifold Filter-Combine Network (MFCN). These networks have a special way of combining information, similar to how graph neural networks combine information across the nodes of a graph. The authors showed that their method works well on large datasets and gets closer to the ideal answer as more data points are added.

Keywords

* Artificial intelligence