
Spiking Neural Networks in Vertical Federated Learning: Performance Trade-offs

by Maryam Abbasihafshejani, Anindya Maiti, Murtuza Jadliwala

First submitted to arXiv on: 24 Jul 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (paper authors)
The high difficulty version is the paper’s original abstract.

Medium Difficulty Summary (GrooveSquid.com, original content)
The abstract presents research on applying Spiking Neural Networks (SNNs) in Vertical Federated Learning (VFL), a technique that enables model training across multiple clients while maintaining data privacy. The authors investigate the benefits and trade-offs of using SNN models in VFL, comparing two federated learning architectures with different privacy and performance implications. They evaluate the setup using the CIFAR-10 and CIFAR-100 benchmark datasets along with SNN implementations of the VGG9 and ResNet classification models. The results show that SNN models achieve accuracy comparable to traditional Artificial Neural Networks (ANNs) for VFL applications while being significantly more energy efficient.
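To illustrate the vertical split described above, here is a minimal toy sketch, not the paper's actual architecture or code, of how two clients holding different features of the same samples each compute a local embedding that a server fuses for classification. All shapes, weights, and the plain-NumPy model are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples whose features are split vertically across two clients.
features_a = rng.normal(size=(4, 3))  # client A holds 3 features per sample
features_b = rng.normal(size=(4, 5))  # client B holds 5 other features

def client_embed(x, w):
    """Each client maps its private features to an embedding locally;
    only the embedding, not the raw data, is sent to the server."""
    return np.maximum(x @ w, 0.0)  # simple linear layer + ReLU

w_a = rng.normal(size=(3, 2))  # client A's local weights
w_b = rng.normal(size=(5, 2))  # client B's local weights

emb_a = client_embed(features_a, w_a)
emb_b = client_embed(features_b, w_b)

# Server: concatenate the clients' embeddings and score 2 classes.
w_top = rng.normal(size=(4, 2))
logits = np.concatenate([emb_a, emb_b], axis=1) @ w_top
print(logits.shape)  # → (4, 2)
```

In a real VFL system the two architectures compared in the paper differ in where this fusion happens and what is exchanged, which drives the privacy/performance trade-off.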
Low Difficulty Summary (GrooveSquid.com, original content)
Federated machine learning helps keep data private by training models on many devices at once, instead of older methods that share the raw data itself. One type of federated learning is called Vertical Federated Learning, or VFL. It works when different devices hold different information about the same things. Scientists are using special kinds of neural networks to make this kind of learning faster and more efficient. They’re trying out a type of network that uses “spikes” instead of regular computer processing. This might help devices use less energy while still getting good results.
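To show what “spikes” means here, a toy leaky integrate-and-fire (LIF) neuron, a common SNN building block rather than anything taken from the paper, can be sketched as follows; the threshold and decay values are arbitrary assumptions:

```python
def lif_spikes(current, threshold=1.0, decay=0.9):
    """Simulate one leaky integrate-and-fire neuron over time:
    the membrane potential leaks, integrates the input current,
    and emits a binary spike (then resets) on crossing threshold."""
    v = 0.0
    spikes = []
    for i in current:
        v = decay * v + i      # leak previous potential, add input
        if v >= threshold:
            spikes.append(1)   # fire a spike
            v = 0.0            # reset after firing
        else:
            spikes.append(0)
    return spikes

print(lif_spikes([0.4, 0.4, 0.4, 0.0, 1.2]))  # → [0, 0, 1, 0, 1]
```

Because the neuron only produces sparse binary events instead of continuous activations, hardware can skip work between spikes, which is the intuition behind the energy savings mentioned above.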

Keywords

» Artificial intelligence  » Classification  » Federated learning  » Machine learning  » Resnet