Summary of Memorization Capacity for Additive Fine-Tuning with Small ReLU Networks, by Jy-yong Sohn et al.


Memorization Capacity for Additive Fine-Tuning with Small ReLU Networks

by Jy-yong Sohn, Dohyun Kwon, Seoyeon An, Kangwook Lee

First submitted to arXiv on: 1 Aug 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Machine Learning (stat.ML)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
The paper studies fine-tuning of pre-trained models through the lens of memorization capacity. It introduces a new measure, Fine-Tuning Capacity (FTC), defined as the maximum number of samples a neural network can fine-tune, or equivalently, the minimum number of neurons m needed to arbitrarily change N labels among K given samples. Analyzing FTC in the additive fine-tuning scenario, the paper derives tight upper and lower bounds: N labels can be changed with m = Θ(N) neurons for 2-layer ReLU networks and m = Θ(√N) neurons for 3-layer ReLU networks, no matter how large the total number of samples K is. The results recover known memorization capacity findings as the special case N = K.

Low Difficulty Summary (written by GrooveSquid.com, original content)
Fine-tuning large pre-trained models is common practice in machine learning, but its mathematical analysis has been largely unexplored. This paper studies fine-tuning through the lens of memorization capacity. It introduces a new measure called Fine-Tuning Capacity (FTC) and analyzes it for additive fine-tuning, where a small ReLU network is added on top of a frozen pre-trained model. The study shows that such added networks can change the labels of many samples using relatively few neurons.

Keywords

» Artificial intelligence  » Fine tuning  » Machine learning  » Neural network  » ReLU