An Upload-Efficient Scheme for Transferring Knowledge From a Server-Side Pre-trained Generator to Clients in Heterogeneous Federated Learning

by Jianqing Zhang, Yang Liu, Yang Hua, Jian Cao

First submitted to arXiv on: 23 Mar 2024

Categories

  • Main: Artificial Intelligence (cs.AI)
  • Secondary: Distributed, Parallel, and Cluster Computing (cs.DC)

GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper, each written at a different level of difficulty. The medium- and low-difficulty versions are original summaries written by GrooveSquid.com, while the high-difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (original content by GrooveSquid.com)
The proposed Federated Knowledge-Transfer-Loop (FedKTL) framework enables task-specific knowledge sharing among clients with different model architectures in heterogeneous federated learning while preserving privacy. A public pre-trained generator, such as StyleGAN or Stable Diffusion, serves as the bridge: through server-side inference it produces task-related prototypical image-vector pairs, and each client then transfers the generator's common knowledge into its local model via an additional supervised local task (a hedged code sketch of this client-side step follows the summaries below). Extensive experiments on four datasets with 14 heterogeneous models show that FedKTL outperforms seven state-of-the-art methods, with performance gains of up to 7.31%. The framework also applies to cloud-edge scenarios with as few as one edge client.

Low Difficulty Summary (original content by GrooveSquid.com)
Heterogeneous Federated Learning (HtFL) helps different devices share knowledge without sharing sensitive data. This is hard because the devices have different models and different data. To solve it, the researchers built a system called Federated Knowledge-Transfer-Loop (FedKTL), which uses a public image generator to help devices learn from each other. The generator creates special image-vector pairs that each device uses to improve its own model. The team tested FedKTL on four datasets and showed it outperformed seven other methods, with improvements as high as 7.31%. The system could be used wherever devices need to work together, such as cloud-edge scenarios.
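
To make the loop concrete, below is a minimal, hypothetical sketch in PyTorch of the client-side knowledge-transfer step. It assumes the server has already run the pre-trained generator and sent each client a small batch of prototypical (image, latent-vector) pairs; LocalModel, align_head, and local_update are illustrative names, not the paper's code, and FedKTL's actual losses and prototype construction differ in detail.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalModel(nn.Module):
    # A tiny client model; any architecture works here, since FedKTL shares
    # prototypes rather than model weights between server and clients.
    def __init__(self, num_classes=10, feat_dim=64, latent_dim=32):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, feat_dim), nn.ReLU(),
        )
        self.classifier = nn.Linear(feat_dim, num_classes)
        # Extra head for the additional supervised local task: map features
        # of the generated prototype images back to their latent vectors.
        self.align_head = nn.Linear(feat_dim, latent_dim)

    def forward(self, x):
        f = self.backbone(x)
        return self.classifier(f), self.align_head(f)

def local_update(model, opt, local_x, local_y, proto_images, proto_vectors, lam=1.0):
    # One client step: the usual supervised loss on private data, plus an
    # alignment loss that distills knowledge from the generator's prototypes.
    logits, _ = model(local_x)
    task_loss = F.cross_entropy(logits, local_y)
    _, pred_vec = model(proto_images)
    transfer_loss = F.mse_loss(pred_vec, proto_vectors)
    loss = task_loss + lam * transfer_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage: random tensors stand in for real client data and for the
# server-produced image-vector prototypes.
model = LocalModel()
opt = torch.optim.SGD(model.parameters(), lr=0.01)
local_x, local_y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
proto_images, proto_vectors = torch.randn(10, 3, 32, 32), torch.randn(10, 32)
print(local_update(model, opt, local_x, local_y, proto_images, proto_vectors))

The design point this sketch is meant to illustrate: only small image-vector prototypes travel between server and clients, so each client can keep an arbitrary local architecture and upload costs stay low.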

Keywords

  • Artificial intelligence
  • Diffusion
  • Federated learning
  • Inference
  • Supervised