
Summary of Budgeted Online Continual Learning by Adaptive Layer Freezing and Frequency-based Sampling, by Minhyuk Seo et al.


Budgeted Online Continual Learning by Adaptive Layer Freezing and Frequency-based Sampling

by Minhyuk Seo, Hyunseo Koh, Jonghyun Choi

First submitted to arXiv on: 19 Oct 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI); Computer Vision and Pattern Recognition (cs.CV)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract; read the original abstract here.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
The paper proposes a framework for comparing and developing online continual learning (CL) algorithms under explicit computational and memory budgets. The authors argue that prior work has largely ignored these constraints, making fair comparison between CL algorithms difficult. To address this, they measure the computational budget in floating-point operations (FLOPs) and the memory budget as total memory size in bytes. Within a fixed total budget, they propose adaptive layer freezing, which skips parameter updates for selected layers to save computation, together with a frequency-based memory retrieval method, so that accuracy is maintained at a lower computational cost.
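As a rough illustration of these two ideas (not the authors' exact algorithm), the sketch below shows a replay buffer that retrieves samples weighted toward those used least often, and a helper that freezes leading model blocks when a per-step FLOPs budget cannot afford their backward pass. The class and function names, the reservoir-style overwrite, and the budget numbers are all illustrative assumptions.

```python
import random
import torch.nn as nn

class FrequencyAwareBuffer:
    """Replay buffer that samples items inversely to how often they were retrieved.
    Illustrative only; not the paper's exact retrieval rule."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.samples = []     # stored (x, y) pairs
        self.use_counts = []  # retrieval count per stored sample

    def add(self, x, y):
        if len(self.samples) < self.capacity:
            self.samples.append((x, y))
            self.use_counts.append(0)
        else:
            # Overwrite a random slot once full (reservoir-style; an assumption).
            idx = random.randrange(self.capacity)
            self.samples[idx] = (x, y)
            self.use_counts[idx] = 0

    def retrieve(self, batch_size):
        if not self.samples:
            return []
        # Rarely used samples get higher sampling weight.
        weights = [1.0 / (1 + c) for c in self.use_counts]
        idxs = random.choices(range(len(self.samples)), weights=weights,
                              k=min(batch_size, len(self.samples)))
        for i in idxs:
            self.use_counts[i] += 1
        return [self.samples[i] for i in idxs]


def freeze_leading_blocks(blocks, backward_flops_per_block, flops_budget):
    """Freeze early blocks so the backward pass fits a per-step FLOPs budget.
    Later blocks are kept trainable first; the FLOPs accounting is simplified."""
    remaining = flops_budget
    for block in reversed(list(blocks)):
        trainable = remaining >= backward_flops_per_block
        if trainable:
            remaining -= backward_flops_per_block
        for p in block.parameters():
            p.requires_grad_(trainable)


# Tiny usage sketch with a toy model (all numbers are made up).
model = nn.Sequential(nn.Linear(8, 8), nn.Linear(8, 8), nn.Linear(8, 2))
freeze_leading_blocks(model, backward_flops_per_block=1.0, flops_budget=2.0)
# -> the first Linear layer ends up frozen; the last two stay trainable.
```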
Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper is about finding ways to improve how computers learn from data over time, without getting too slow or using too much storage space. The authors want to make sure that different methods for doing this are compared fairly, so they come up with two new ways to measure how much work the computer has to do and how much memory it uses. They also suggest some new ideas for making these learning processes more efficient without losing accuracy.

Keywords

» Artificial intelligence  » Continual learning