Curriculum Negative Mining For Temporal Networks

by Ziyue Chen, Tongya Zheng, Mingli Song

First submitted to arXiv on: 24 Jul 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI)

GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper, each written at a different level of difficulty. The medium and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to read the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (original content by GrooveSquid.com)
A novel Temporal Graph Neural Network (TGNN) training framework, Curriculum Negative Mining (CurNM), is introduced to address the challenges of positive sparsity and positive shift in temporal networks. The framework adaptively adjusts the difficulty of negative samples by dynamically updating a negative pool that balances random, historical, and hard negatives. Additionally, a temporal-aware negative selection module learns from the disentangled factors of recently active edges to accurately capture shifting preferences. Experimental results on 12 datasets and 3 TGNNs demonstrate significant performance improvements over baseline methods, while ablation studies and parameter sensitivity experiments verify the usefulness and robustness of CurNM.

Low Difficulty Summary (original content by GrooveSquid.com)
Curriculum Negative Mining (CurNM) is a new way to train Temporal Graph Neural Networks (TGNNs). It improves how well TGNNs work by making sure they learn from good negative samples. Temporal networks pose two challenges: positive sparsity and positive shift. Positive sparsity means there is usually only one positive sample among many negative ones at each timestamp. Positive shift means the positive samples change over time. CurNM addresses these challenges by maintaining a pool of negative samples that balances random, historical, and hard negatives. It also includes a module that learns from recently active edges to capture changing preferences.
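
To make the sampling idea concrete, here is a minimal Python sketch of a curriculum negative pool. It is illustrative only: the function name sample_negatives, the even split between hard and historical negatives, and the linear difficulty schedule are assumptions made for exposition, not the authors' implementation.

```python
import numpy as np

def sample_negatives(rng, all_nodes, history, scores, step, total_steps, k=10):
    """Illustrative curriculum negative pool for temporal link prediction.

    Mixes three sources of negative destination nodes. Early in training
    the pool is mostly easy random negatives; as training progresses it
    shifts toward historical and model-scored (hard) negatives.
    This is a sketch of the idea, not the paper's implementation.
    """
    difficulty = step / total_steps  # curriculum weight in [0, 1]

    # Split the budget of k negatives: harder sources grow with difficulty.
    n_hard = int(k * difficulty * 0.5)                 # highest-scored candidates
    n_hist = min(int(k * difficulty * 0.5), len(history))
    n_rand = k - n_hard - n_hist                       # uniform random fill

    # Hard negatives: candidates the current model ranks highest.
    hard = all_nodes[np.argsort(-scores)[:n_hard]]
    # Historical negatives: destinations this source interacted with before.
    hist = rng.choice(history, size=n_hist, replace=True)
    # Random negatives: uniform over all nodes.
    rand = rng.choice(all_nodes, size=n_rand, replace=True)

    return np.concatenate([hard, hist, rand])

# Toy usage: 1,000 candidate nodes, a short interaction history, and random
# model scores; halfway through training the pool is half easy, half hard.
rng = np.random.default_rng(0)
all_nodes = np.arange(1000)
history = np.array([3, 7, 42])
scores = rng.random(1000)
print(sample_negatives(rng, all_nodes, history, scores, step=50, total_steps=100))
```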

Keywords

* Artificial intelligence
* Graph neural network