
Summary of FedLPS: Heterogeneous Federated Learning for Multiple Tasks with Local Parameter Sharing, by Yongzhe Jia et al.


FedLPS: Heterogeneous Federated Learning for Multiple Tasks with Local Parameter Sharing

by Yongzhe Jia, Xuyun Zhang, Amin Beheshti, Wanchun Dou

First submitted to arXiv on: 13 Feb 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI); Distributed, Parallel, and Cluster Computing (cs.DC)



GrooveSquid.com Paper Summaries

GrooveSquid.com aims to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same paper at different levels of difficulty: the medium and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to read whichever version suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
The proposed Federated Learning (FL) framework, Heterogeneous Federated Learning with Local Parameter Sharing (FedLPS), addresses the limitations of existing FL approaches through a novel heterogeneous model aggregation algorithm and principles drawn from transfer learning. FedLPS enables multiple tasks to be deployed on a single device by dividing the local model into a shareable encoder and task-specific predictors, reducing resource consumption while accounting for data and system heterogeneity. Experimental results show that FedLPS outperforms state-of-the-art FL frameworks by up to 4.88% and reduces computational resource consumption by 21.3%.
Low Difficulty Summary (written by GrooveSquid.com, original content)
FedLPS is a new way to help computers learn together without sharing all their data. This makes it better for protecting people’s privacy. Right now, there are some big problems with this kind of learning, like devices running out of power or having different kinds of data. The people who did this research came up with a new way to make it work better. They called it Heterogeneous Federated Learning with Local Parameter Sharing, or FedLPS for short. It’s like a special recipe that lets computers learn from each other and share some information, but not too much. This makes it more efficient and better at learning.

Keywords

* Artificial intelligence  * Federated learning  * Transfer learning