

Input Snapshots Fusion for Scalable Discrete-Time Dynamic Graph Neural Networks

by QingGuo Qi, Hongyang Chen, Minhao Cheng, Han Liu

First submitted to arXiv on: 11 May 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (paper authors)
Read the original abstract here.

Medium Difficulty Summary (GrooveSquid.com, original content)
The proposed Input Snapshots Fusion based Dynamic Graph Neural Network (SFDyG) tackles the underexplored problem of modeling temporal edges in discrete-time dynamic graphs. Traditional approaches process each snapshot with a sequential model, so training cost grows with the number of snapshots. SFDyG instead fuses multiple input snapshots into a single temporal graph and combines Hawkes processes with graph neural networks to capture temporal and structural patterns efficiently. This fusion decouples model complexity from the number of snapshots, enabling efficient training. Experimental results on eight datasets show that SFDyG outperforms existing methods on future link prediction tasks.
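
To make the fusion idea concrete, here is a minimal Python sketch of merging snapshots into one temporal graph with Hawkes-style exponential time decay. It is an illustration under our own assumptions, not the authors' implementation; the helper name fuse_snapshots and the decay rate delta are hypothetical.

    from collections import defaultdict
    import math

    def fuse_snapshots(snapshots, delta=0.5):
        """Merge a list of snapshots (edge lists) into one temporal graph.

        Each repeated edge accumulates a Hawkes-style weight
        exp(-delta * age), so recent snapshots count more than old ones.
        """
        t_now = len(snapshots) - 1                 # timestamp of the latest snapshot
        fused = defaultdict(float)
        for t, edges in enumerate(snapshots):
            decay = math.exp(-delta * (t_now - t))  # older edges are down-weighted
            for u, v in edges:
                fused[(u, v)] += decay             # past events sum up, as in a Hawkes process
        return dict(fused)

    # Three snapshots become one weighted graph that a static GNN layer can
    # consume, so per-step training cost no longer depends on the snapshot count.
    snapshots = [[(0, 1), (1, 2)], [(0, 1)], [(2, 3)]]
    print(fuse_snapshots(snapshots))
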
Low Difficulty Summary (GrooveSquid.com, original content)
This paper develops a new way to understand how networks change over time. Researchers study how nodes and the connections between them appear and disappear in real-world situations using something called dynamic graph representation learning, but current methods still struggle to capture the flow of time. The authors combine two techniques: Hawkes processes, which model patterns of events over time, and graph neural networks, which learn from the structure of the network. The resulting approach, called SFDyG (Input Snapshots Fusion based Dynamic Graph Neural Network), is faster and more accurate than existing methods, and it scales to really big datasets.

Keywords

» Artificial intelligence  » Graph neural network  » Representation learning