Learn More by Using Less: Distributed Learning with Energy-Constrained Devices
by Roberto Pereira, Cristian J. Vaca-Rubio, Luis Blanco
First submitted to arXiv on: 3 Dec 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Distributed, Parallel, and Cluster Computing (cs.DC); Signal Processing (eess.SP)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper at different levels of difficulty. The medium- and low-difficulty versions are original summaries written by GrooveSquid.com, while the high-difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract here |
Medium | GrooveSquid.com (original content) | LeanFed is an energy-aware federated learning (FL) framework that jointly optimizes client selection and training workloads on battery-constrained devices. It adapts data usage dynamically, adjusting the fraction of local data each device trains on so that participation is maximized without any device exhausting its battery mid-round (see the sketch after the table). Evaluated against standard FedAvg on the CIFAR-10 and CIFAR-100 datasets under varying levels of data heterogeneity and device participation, LeanFed improves model accuracy and stability, particularly under high data heterogeneity and tight battery budgets, by mitigating client dropout and keeping devices available longer. |
Low | GrooveSquid.com (original content) | LeanFed is a new way to train models on devices with limited battery power. It makes sure each device uses just the right amount of its own data during training, so it does not run out of battery before finishing. This keeps more devices participating in training, which improves the model’s accuracy and stability. The researchers tested LeanFed on popular datasets and found that it works well, especially when the data varies a lot across devices and battery power is scarce. |
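To make the adaptive-data-usage idea concrete, here is a minimal Python sketch of one way a server might plan a training round under per-device energy budgets. It is only an illustration under assumed names and a toy linear energy model (`Client`, `plan_round`, `COST_PER_SAMPLE`, and `MIN_FRACTION` are all hypothetical), not the paper's actual algorithm.

```python
# Minimal sketch of energy-aware client selection with adaptive data usage,
# in the spirit of the LeanFed summary above. The energy model and all
# names here are illustrative assumptions, not the paper's implementation.
from dataclasses import dataclass

COST_PER_SAMPLE = 0.002   # assumed energy cost of training on one sample (J)
MIN_FRACTION = 0.1        # assumed smallest useful fraction of local data

@dataclass
class Client:
    name: str
    battery: float      # remaining energy budget (J)
    num_samples: int    # size of the local dataset

def plan_round(clients, local_epochs=1):
    """Pick participants and the fraction of local data each one trains on.

    A client joins the round only if its battery covers at least
    MIN_FRACTION of its data; otherwise it is skipped, which is how this
    sketch avoids mid-round dropouts.
    """
    plan = {}
    for c in clients:
        full_cost = c.num_samples * local_epochs * COST_PER_SAMPLE
        affordable = min(1.0, c.battery / full_cost)  # fraction it can afford
        if affordable >= MIN_FRACTION:
            plan[c.name] = affordable
    return plan

if __name__ == "__main__":
    fleet = [
        Client("phone-a", battery=5.0, num_samples=2000),   # can do everything
        Client("phone-b", battery=2.0, num_samples=3000),   # partial budget
        Client("sensor-c", battery=0.001, num_samples=500), # too depleted
    ]
    for name, frac in plan_round(fleet).items():
        print(f"{name}: train on {frac:.0%} of local data")
```

In this toy version, a device that can only afford a third of its data simply trains on a third, while a nearly depleted device sits the round out; the framework described in the paper makes this kind of trade-off adaptively to keep participation high across rounds.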
Keywords
- Artificial intelligence
- Dropout
- Federated learning