Federated Hybrid Training and Self-Adversarial Distillation: Towards Robust Edge Networks

by Yu Qiao, Apurba Adhikary, Kitae Kim, Eui-Nam Huh, Zhu Han, Choong Seon Hong

First submitted to arXiv on 26 Dec 2024

Categories

  • Main: Computer Vision and Pattern Recognition (cs.CV)
  • Secondary: Machine Learning (cs.LG)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same paper but are written at different levels of difficulty: the medium and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to read whichever version suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract on arXiv.

Medium Difficulty Summary (GrooveSquid.com original content)
Federated learning (FL) enables devices to collaborate on training a shared model without transmitting raw data, enhancing privacy in mobile edge networks. However, data heterogeneity across devices and adversarial attacks make it difficult to develop an unbiased, robust global model for deployment. The proposed Federated hyBrid Adversarial training and self-adversarial disTillation (FedBAT) framework improves robustness and generalization by integrating hybrid adversarial training and self-adversarial distillation into the conventional FL pipeline, from the data augmentation and feature distillation perspectives respectively. FedBAT balances accuracy and robustness through weighted standard and adversarial training, while a novel augmentation-invariant adversarial distillation method aligns each client's local features with the global model's clean features to mitigate the bias introduced by heterogeneity. Experiments across multiple datasets show that FedBAT matches or outperforms several baselines in robustness while maintaining accuracy.
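
To make the two components concrete, here is a minimal PyTorch-style sketch of what one FedBAT-like local update might look like. This is an illustration under stated assumptions, not the authors' implementation: the PGD attack settings, the loss weights `lam` and `mu`, the MSE feature-alignment loss, and the `features(model, x)` helper (assumed to return penultimate-layer features) are all hypothetical.

```python
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8 / 255, alpha=2 / 255, steps=10):
    """Craft adversarial examples with PGD inside an L-infinity ball of
    radius eps around the clean inputs (assumes inputs scaled to [0, 1])."""
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1)
    for _ in range(steps):
        x_adv = x_adv.detach().requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad, = torch.autograd.grad(loss, x_adv)
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach()

def fedbat_local_step(model, global_model, features, x, y, optimizer,
                      lam=0.5, mu=1.0):
    """One hypothetical local update combining the two FedBAT ideas."""
    x_adv = pgd_attack(model, x, y)

    # (1) Hybrid adversarial training: a weighted mix of standard and
    # adversarial cross-entropy balances clean accuracy and robustness.
    loss_std = F.cross_entropy(model(x), y)
    loss_adv = F.cross_entropy(model(x_adv), y)
    loss_hybrid = lam * loss_std + (1.0 - lam) * loss_adv

    # (2) Self-adversarial distillation: align the local model's features
    # on adversarial inputs with the frozen global model's features on
    # clean inputs, to counter the bias from heterogeneous local data.
    with torch.no_grad():
        target_feat = features(global_model, x)   # clean global features
    local_feat = features(model, x_adv)           # adversarial local features
    loss_distill = F.mse_loss(local_feat, target_feat)

    loss = loss_hybrid + mu * loss_distill
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Computing the distillation target under `torch.no_grad()` is what makes this a self-distillation: the teacher is simply the frozen global model from the last aggregation round, so no separate teacher network is needed.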
Low Difficulty Summary (GrooveSquid.com original content)
Imagine you want many devices to learn from their data together without ever sharing the data itself. This is called federated learning, and it helps keep things private on mobile devices. However, when different devices hold different kinds of data, it is hard to build one global model that works well for all of them, and attackers can also try to fool the model. To fix this, researchers created a new training method called Federated hyBrid Adversarial training and self-adversarial disTillation (FedBAT). It does two things: first, it mixes normal training with training on attack examples, so the model stays accurate while becoming harder to fool. Second, it nudges each device's features to line up with the shared global model's features, so all the devices work well together. Tests show that FedBAT can make models better at resisting attacks while still being accurate.

Keywords

» Artificial intelligence  » Data augmentation  » Distillation  » Federated learning  » Generalization