Space for Improvement: Navigating the Design Space for Federated Learning in Satellite Constellations
by Grace Kim, Luca Powell, Filip Svoboda, Nicholas Lane
First submitted to arXiv on: 31 Oct 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: None
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary
---|---|---
High | Paper authors | Read the original abstract here
Medium | GrooveSquid.com (original content) | A novel federated learning (FL) framework is proposed for scaling machine learning on board spacecraft, addressing the satellite downlink deficit through collaborative training in orbit. Existing FL algorithms can be adapted to scenario-specific constraints, but such theoretical implementations face limitations that hinder real-world deployment. The paper develops a method for "space-ifying" existing FL algorithms and evaluates it on FLySTacK, a novel satellite constellation design and hardware-aware testing platform. The proposed framework reduces model training time by 12.5% to 37.5% compared to leading alternatives.
Low | GrooveSquid.com (original content) | Space is becoming an exciting new frontier for machine learning, with satellites carrying deep learning capabilities on board. But there's a problem: satellites generate far more data than they have opportunities to transmit. To address this, researchers have been exploring federated learning (FL), which lets different devices train a shared model together without sharing their raw data. However, current implementations of FL for space-based applications face several limitations that prevent them from being used in real-world scenarios. This paper proposes a new way to do FL in space that works across entire constellations and reduces training time by 12.5% to 37.5%.
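To make the core idea concrete, here is a minimal FedAvg-style sketch of the collaborative-training pattern the summaries describe: each "satellite" trains locally on its own data, and a server averages the resulting parameters. This is a generic illustration only, not the paper's actual algorithm; all names, data, and hyperparameters below are hypothetical.

```python
# Minimal federated-averaging sketch (illustrative only, NOT the
# paper's method). Each hypothetical "satellite" fits a 1-D linear
# model y = w * x on its local data; a server averages the weights.

def local_update(weights, data, lr=0.1, epochs=5):
    """One satellite's local training: gradient descent on
    mean-squared error for the model y = w * x."""
    w = weights
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(global_w, client_datasets):
    """Server broadcasts global_w, each client trains locally,
    and the server averages the returned weights (FedAvg step).
    No raw data ever leaves a client."""
    local_ws = [local_update(global_w, d) for d in client_datasets]
    return sum(local_ws) / len(local_ws)

# Three hypothetical satellites, each holding (x, y) pairs drawn
# from the same underlying relation y = 2x.
satellites = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0)],
    [(0.5, 1.0), (4.0, 8.0)],
]

w = 0.0
for _ in range(30):
    w = federated_round(w, satellites)
# After repeated rounds, w converges toward the shared optimum 2.0.
```

Note that only model weights cross the (bandwidth-limited) link, which is exactly why FL is attractive when downlink capacity, rather than on-board data, is the bottleneck.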
Keywords
» Artificial intelligence » Deep learning » Federated learning » Machine learning