Summary of Towards Efficient Communication and Secure Federated Recommendation System via Low-rank Training, by Ngoc-Hieu Nguyen et al.


Towards Efficient Communication and Secure Federated Recommendation System via Low-rank Training

by Ngoc-Hieu Nguyen, Tuan-Anh Nguyen, Tuan Nguyen, Vu Tien Hoang, Dung D. Le, Kok-Seng Wong

First submitted to arXiv on: 8 Jan 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Cryptography and Security (cs.CR); Distributed, Parallel, and Cluster Computing (cs.DC); Information Retrieval (cs.IR)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors: the paper's original abstract)

Medium Difficulty Summary (original content written by GrooveSquid.com)
The Federated Recommendation (FedRec) system protects user data by keeping it on-device and transmitting neural network models between user devices and a central server instead. However, exchanging full models incurs substantial communication costs, and existing remedies introduce computational overheads, constrain model architectures, or conflict with secure aggregation protocols. To address these challenges, the CoLR framework trains only a small set of lightweight low-rank parameters while keeping most of the model frozen. This reduces communication overhead without adding computational burden and remains compatible with secure aggregation protocols such as Homomorphic Encryption. The proposed method reduces payload size by up to 93.75% while incurring only an approximate 8% decrease in recommendation performance across datasets.
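The low-rank training idea behind that payload reduction can be sketched in a few lines. This is an illustrative sketch, not the authors' implementation: the matrix sizes (1000 items, 64 dimensions, rank 4) and all variable names are assumptions chosen to make the arithmetic concrete. The client keeps its full embedding matrix frozen and trains only two small factors, so each round it uploads rank*(n+d) numbers instead of n*d.

```python
import numpy as np

# Hypothetical sizes for illustration only (not from the paper).
rng = np.random.default_rng(0)
n_items, dim, rank = 1000, 64, 4

W_frozen = rng.standard_normal((n_items, dim))   # frozen base embeddings (never transmitted)
B = np.zeros((n_items, rank))                    # trainable low-rank factor
A = rng.standard_normal((rank, dim)) * 0.01      # trainable low-rank factor

def effective_embeddings():
    # The model behaves as if its embeddings were W_frozen + B @ A,
    # but only the small factors B and A are communicated each round.
    return W_frozen + B @ A

full_payload = W_frozen.size            # what naive FedRec would upload
lowrank_payload = B.size + A.size       # what the low-rank client uploads
reduction = 1 - lowrank_payload / full_payload
print(f"payload reduction: {reduction:.2%}")
# prints: payload reduction: 93.35%
```

Because the uploaded factors are plain numeric arrays that the server only needs to sum across clients, they can be aggregated under additively homomorphic schemes, which is consistent with the compatibility claim in the summary.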
Low Difficulty Summary (original content written by GrooveSquid.com)
The CoLR framework is a new way to protect user data: instead of sending raw data, devices and a central server exchange model updates. Right now, that exchange takes a lot of bandwidth and can cause problems for the models being used. The researchers came up with an idea that shrinks the amount of data sent without making it harder for computers to do their jobs. This new approach also works with techniques that keep data safe, like Homomorphic Encryption. Using this method, they were able to send much less data and still get good results.

Keywords

  • Artificial intelligence
  • Neural network