On Functional Dimension and Persistent Pseudodimension

by J. Elisenda Grigsby, Kathryn Lindsey

First submitted to arXiv on: 22 Oct 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Algebraic Geometry (math.AG); Combinatorics (math.CO)

GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)

Read the original abstract here.

Medium Difficulty Summary (original content by GrooveSquid.com)
The paper explores redundancy in the parameter spaces of ReLU neural networks by introducing two locally applicable complexity measures: local functional dimension and persistent pseudodimension. These measures can be computed on finite batches of points and are related to each other, with the former providing bounds that help explain the mechanics of the double descent phenomenon. (A code sketch of the batch computation appears after the summaries below.)
Low Difficulty Summary (original content by GrooveSquid.com)
This research helps us understand how different settings of neural network parameters can produce the same results. It’s like having many keys that can open the same lock! The study introduces two new ways to measure how complex a neural network is in certain areas, which can help explain why some networks work better than others.
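
To make the "finite batches of points" idea concrete, here is a minimal sketch, not the authors' code, of one natural way to compute a batch version of local functional dimension: take the rank of the Jacobian of the map from network parameters to the network's outputs on a fixed batch. The layer sizes, batch, and all function names below are illustrative assumptions.

```python
# A minimal, illustrative sketch (not the authors' code): estimate a batch
# version of local functional dimension as the rank of the Jacobian of the
# map "network parameters -> outputs on a fixed finite batch of inputs".
# Layer sizes, the batch, and all names below are assumptions for illustration.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def relu_net(params, x):
    """A tiny fully connected ReLU network; params is a list of (W, b) pairs."""
    for W, b in params[:-1]:
        x = jax.nn.relu(x @ W.T + b)
    W, b = params[-1]
    return x @ W.T + b

def batch_functional_dimension(params, batch):
    """Rank of d(outputs on batch)/d(parameters), evaluated at params.

    The rank is well defined wherever the parameterization map is smooth;
    for generic parameters and batches, the ReLU kinks are avoided.
    """
    flat, unravel = ravel_pytree(params)             # flatten params to one vector
    f = lambda p: relu_net(unravel(p), batch).ravel()
    J = jax.jacobian(f)(flat)                        # shape: (batch * out_dim, n_params)
    return int(jnp.linalg.matrix_rank(J))

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
params = [
    (jax.random.normal(k1, (8, 3)), jnp.zeros(8)),   # hidden layer: R^3 -> R^8
    (jax.random.normal(k2, (1, 8)), jnp.zeros(1)),   # output layer: R^8 -> R
]
batch = jax.random.normal(k3, (64, 3))               # 64 sample inputs in R^3
print(batch_functional_dimension(params, batch))     # at most the parameter count
```

Any rank deficit in this Jacobian is one way to see the redundancy the low difficulty summary describes: directions in parameter space along which the computed function, at least on this batch, does not change. Persistent pseudodimension is a different, combinatorial quantity and is not sketched here.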

Keywords

  • Artificial intelligence
  • Neural network
  • ReLU