Federated Domain Generalization with Label Smoothing and Balanced Decentralized Training

by Milad Soltany, Farhad Pourpanah, Mahdiyar Molahasani, Michael Greenspan, Ali Etemad

First submitted to arXiv on: 16 Dec 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI)

GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract, available on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
Federated Domain Generalization with Label Smoothing and Balanced Decentralized Training (FedSB) tackles data heterogeneity in federated learning. FedSB applies label smoothing at the client level to prevent overfitting, which improves generalization across diverse domains when local models are aggregated into a global model. In addition, FedSB incorporates a decentralized budgeting mechanism that balances training among clients, further improving the aggregated global model. Experiments on four multi-domain datasets show that FedSB outperforms competing methods, achieving state-of-the-art results on three of the four.
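
Label smoothing itself is a standard technique, so a short sketch may help make the idea concrete. The plain-Python/NumPy function below shows the usual formulation, in which the one-hot target is mixed with a uniform distribution before computing cross-entropy. It illustrates the general idea FedSB applies on each client, not the paper's exact loss; the function name and the smoothing factor eps are illustrative choices.

import numpy as np

def label_smoothed_cross_entropy(logits, target, eps=0.1):
    # Mix the one-hot target with a uniform distribution:
    # (1 - eps) goes to the true class, eps is spread over all classes.
    num_classes = logits.shape[-1]
    smoothed = np.full(num_classes, eps / num_classes)
    smoothed[target] += 1.0 - eps
    # Numerically stable log-softmax.
    shifted = logits - logits.max()
    log_probs = shifted - np.log(np.exp(shifted).sum())
    # Cross-entropy against the smoothed target: the model is never
    # rewarded for pushing probabilities all the way to 0 or 1, which
    # discourages overconfident, domain-specific predictions.
    return -(smoothed * log_probs).sum()

# Even a confident, correct prediction incurs a small loss under smoothing.
print(label_smoothed_cross_entropy(np.array([6.0, 1.0, 0.5]), target=0))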
Low Difficulty Summary (written by GrooveSquid.com, original content)
FedSB is a new way for different groups to train a shared model without sharing their individual data. It uses a technique called label smoothing to keep each group's model from getting so good at its own type of data that it can't apply what it learned to other types. FedSB also balances how much each group contributes to the shared model, which helps the shared model perform better overall. Tests on four different collections of data show that FedSB is more effective than similar approaches.
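
The summaries do not spell out how FedSB's decentralized budgeting rule is computed, so the sketch below only shows the surrounding step it feeds into: a FedAvg-style weighted average of client models into a global model. The budgets argument here is a hypothetical stand-in for whatever balanced contribution each client ends up making under the paper's mechanism.

import numpy as np

def aggregate_global_model(client_params, budgets=None):
    # Weighted average of client parameter vectors (FedAvg-style).
    # `budgets` is a hypothetical stand-in for FedSB's balanced,
    # decentralized training budgets; equal weights by default.
    if budgets is None:
        budgets = np.ones(len(client_params))
    weights = np.asarray(budgets, dtype=float)
    weights /= weights.sum()
    return sum(w * p for w, p in zip(weights, client_params))

# Three clients with toy two-parameter "models"; equal budgets give the mean.
clients = [np.array([1.0, 2.0]), np.array([3.0, 0.0]), np.array([2.0, 4.0])]
print(aggregate_global_model(clients))  # -> [2. 2.]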

Keywords

» Artificial intelligence  » Domain generalization  » Federated learning  » Generalization  » Overfitting