
Local Superior Soups: A Catalyst for Model Merging in Cross-Silo Federated Learning

by Minghui Chen, Meirui Jiang, Xin Zhang, Qi Dou, Zehua Wang, Xiaoxiao Li

First submitted to arXiv on: 31 Oct 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here
Medium Difficulty Summary (GrooveSquid.com, original content)
Federated learning (FL) enables collaborative model training using decentralized data. Pre-trained weight initialization has been shown to improve FL performance, but the increasing complexity of current pre-trained models exacerbates communication cost issues. To address this, we propose “Local Superior Soups,” a model interpolation-based local training technique that enhances exploration of a connected low-loss basin within a few communication rounds. This approach facilitates seamless adaptation of pre-trained models in FL. We demonstrate its effectiveness and efficiency across diverse widely-used FL datasets.
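To make the model-interpolation idea concrete, here is a minimal sketch of the basic merging step: averaging the parameters of several locally trained models into one "soup." Note that the paper's actual Local Superior Soups method does more than this (it regularizes local training to keep candidate models diverse yet connected in a low-loss basin); all function and variable names below are illustrative, not taken from the paper.

```python
# Minimal sketch of model-weight interpolation ("model soup" averaging).
# Assumes each model is represented as a dict mapping parameter names to
# NumPy arrays; real FL systems would use framework-specific state dicts.
import numpy as np

def merge_models(state_dicts, coeffs=None):
    """Interpolate several models' parameters into one merged model.

    state_dicts: list of dicts mapping parameter name -> np.ndarray
    coeffs: interpolation weights (default: uniform average)
    """
    n = len(state_dicts)
    if coeffs is None:
        coeffs = [1.0 / n] * n
    assert abs(sum(coeffs) - 1.0) < 1e-8, "coefficients should sum to 1"
    merged = {}
    for name in state_dicts[0]:
        merged[name] = sum(c * sd[name] for c, sd in zip(coeffs, state_dicts))
    return merged

# Example: three "locally trained" models, each with one weight matrix.
models = [{"w": np.full((2, 2), float(i))} for i in (1, 2, 3)]
soup = merge_models(models)
print(soup["w"])  # uniform average of 1, 2, 3 -> all entries equal 2.0
```

In a cross-silo FL round, each client would produce its own fine-tuned parameters locally, and a merge like this (with coefficients chosen to stay in a connected low-loss region) would replace many extra communication rounds.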
Low Difficulty Summary (GrooveSquid.com, original content)
Imagine if computers could work together to learn new things, even if they have different information. This is called federated learning (FL). Researchers found that using old knowledge can help with this process. But the more complex the old knowledge becomes, the harder it is for computers to share and learn from each other. To fix this problem, scientists developed a new way to train models using a combination of local learning and sharing information between computers. This method helps computers learn quickly and efficiently. The results show that this approach works well on various datasets.

Keywords

  • Artificial intelligence
  • Federated learning