


Regularizing and Aggregating Clients with Class Distribution for Personalized Federated Learning

by Gyuejeong Lee, Daeyoung Choi

First submitted to arXiv on: 12 Jun 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Distributed, Parallel, and Cluster Computing (cs.DC)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
The proposed Class-wise Federated Averaging (cwFedAVG) method for personalized federated learning (PFL) addresses the high computational and communication costs of existing PFL methods. cwFedAVG performs Federated Averaging class-wise, so the server maintains multiple class-wise global models, one per class. Each local model is then built by aggregating these global models, weighted by the client's estimated local class distribution, which is derived from the L2-norms of the deep network's weights. A Weight Distribution Regularizer (WDR) is added to improve the accuracy of this class-distribution estimate. Experimental results show that cwFedAVG matches or outperforms existing PFL methods while remaining computationally efficient and conceptually simple.
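The class-wise aggregation described above can be sketched in a few lines. The snippet below is an illustrative simplification, not the authors' code: the function names, tensor shapes, and toy data are assumptions, and the class distribution is approximated here simply by normalizing the L2-norms of the final-layer weight rows (one row per class).

```python
import numpy as np

def estimate_class_distribution(classifier_weights):
    """Approximate a client's local class distribution from the L2-norms
    of its final-layer weight rows (one row per class). Hypothetical
    simplification of the paper's estimation step."""
    norms = np.linalg.norm(classifier_weights, axis=1)
    return norms / norms.sum()

def personalize(class_global_models, class_dist):
    """Combine the per-class global models into one personalized local
    model, weighting each class-wise model by the estimated proportion
    of that class in the client's data."""
    return {
        name: sum(p * m[name] for p, m in zip(class_dist, class_global_models))
        for name in class_global_models[0]
    }

# Toy example: 3 classes; each "global model" is a dict with one tensor.
rng = np.random.default_rng(0)
globals_per_class = [{"w": rng.normal(size=(4, 4))} for _ in range(3)]
head = rng.normal(size=(3, 8))  # assumed final layer: 3 classes x 8 features
dist = estimate_class_distribution(head)
local_model = personalize(globals_per_class, dist)
```

In this sketch, a client whose classifier weights have a large norm for one class receives a personalized model dominated by that class's global model; the paper's WDR would additionally regularize the weight norms so that this estimate tracks the true local class distribution.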
Low Difficulty Summary (written by GrooveSquid.com, original content)
Personalized federated learning helps create customized models for people with different data. But current ways of doing this can be slow and use too much computing power. This paper introduces a new approach called Class-wise Federated Averaging (cwFedAVG). It works by creating a separate model for each class on the server, and then combining these models based on how common each class is in a client's local data. To make it even better, the authors added a regularizer that helps estimate each client's class mix more accurately. The results show that cwFedAVG performs as well as or better than other methods while using less computing power.

Keywords

» Artificial intelligence  » Federated learning