Summary of One-Shot Federated Learning with Bayesian Pseudocoresets, by Tim d’Hondt et al.
One-Shot Federated Learning with Bayesian Pseudocoresets
by Tim d’Hondt, Mykola Pechenizkiy, Robert Peharz
First submitted to arXiv on: 4 Jun 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI); Distributed, Parallel, and Cluster Computing (cs.DC); Machine Learning (stat.ML)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract here |
Medium | GrooveSquid.com (original content) | A Bayesian federated learning (FL) approach is proposed that reduces communication between server and clients by casting the global inference problem as a product of local client posteriors (made concrete in the sketch below this table). The technique avoids destructive collapse of posterior modes, enabling efficient FL even for multi-modal likelihoods such as those of neural networks. Using approximate inference on function-space representations of the client posteriors, the algorithm achieves state-of-the-art prediction performance while reducing communication cost by up to two orders of magnitude, and it provides well-calibrated uncertainty estimates. The proposed Bayesian FL algorithm is based on learning Bayesian pseudocoresets and offers a tractable solution for distributed function-space inference. |
Low | GrooveSquid.com (original content) | Scientists have developed a new way for many devices to train machine learning models together without sharing their raw data. This approach is called federated learning, or FL for short. Normally, FL requires sending lots of information back and forth between the server and the devices doing the training. This new method needs only a single round of communication, making it much faster. The team behind this innovation used a branch of statistics called Bayesian inference to make it work. The new approach not only saves time but also gives better results and can even say how certain it is about its answers. |
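To make the "product of local client posteriors" idea concrete, here is the standard decomposition that one-shot Bayesian FL builds on (the notation is ours, not necessarily the paper's). If the data $\mathcal{D} = \bigcup_{k=1}^{K} \mathcal{D}_k$ is partitioned across $K$ clients and observations are conditionally independent given the parameters $\theta$, then

$$
p(\theta \mid \mathcal{D}) \;\propto\; p(\theta) \prod_{k=1}^{K} p(\mathcal{D}_k \mid \theta) \;\propto\; \frac{\prod_{k=1}^{K} p(\theta \mid \mathcal{D}_k)}{p(\theta)^{K-1}},
$$

so each client can, in principle, communicate its local posterior once and the server recovers the global posterior by multiplication. For neural networks the local posteriors are multi-modal in weight space, which is why naive weight-space aggregation collapses modes; the paper instead represents the factors in function space via Bayesian pseudocoresets.

Below is a minimal runnable sketch of the aggregation identity in the one setting where it is exact in closed form: a conjugate 1-D Gaussian model. This toy example is our illustration of the product rule only, not the paper's pseudocoreset method; the names and setup are assumptions for illustration.

```python
import numpy as np

# Toy illustration of one-shot product-of-posteriors aggregation
# (conjugate Gaussian model; NOT the paper's pseudocoreset method).
rng = np.random.default_rng(0)
mu_true, noise_var, prior_var = 1.5, 0.5, 4.0  # prior is N(0, prior_var)
clients = [rng.normal(mu_true, np.sqrt(noise_var), size=20) for _ in range(5)]
K = len(clients)

def local_posterior(x):
    """Precision and precision-weighted mean of the local posterior p(theta | x)."""
    prec = 1.0 / prior_var + len(x) / noise_var
    pw_mean = x.sum() / noise_var  # prior mean is 0, so no prior term here
    return prec, pw_mean

# Each client sends its two posterior statistics once; the server multiplies
# the local posteriors and divides out the K-1 surplus copies of the prior.
precs, pw_means = zip(*(local_posterior(x) for x in clients))
global_prec = sum(precs) - (K - 1) / prior_var
global_mean = sum(pw_means) / global_prec

# Sanity check: matches the posterior computed on the pooled data.
full_prec, full_pw = local_posterior(np.concatenate(clients))
assert np.isclose(global_prec, full_prec)
assert np.isclose(global_mean, full_pw / full_prec)
print(f"posterior mean {global_mean:.3f}, variance {1.0 / global_prec:.3f}")
```

For non-conjugate models such as neural networks these sufficient statistics have no closed form; the paper's contribution is a tractable function-space surrogate for them.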
Keywords
» Artificial intelligence » Bayesian inference » Federated learning » Inference » Multi-modal