
Why Does Dropping Edges Usually Outperform Adding Edges in Graph Contrastive Learning?

by Yanchen Xu, Siqi Huang, Hongyuan Zhang, Xuelong Li

First submitted to arXiv on: 11 Dec 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract.

Medium Difficulty Summary (original content by GrooveSquid.com)
In this paper, researchers investigate a limitation of graph contrastive learning (GCL) and propose a new method to improve its performance. Specifically, they analyze why dropping edges typically outperforms adding edges in GCL and introduce a novel metric, the Error Passing Rate (EPR). The proposed algorithm, Error-PAssing-based Graph Contrastive Learning (EPAGCL), uses both edge adding and edge dropping as augmentations. Experiments on real-world datasets validate both the theoretical analysis and the effectiveness of EPAGCL.

Low Difficulty Summary (original content by GrooveSquid.com)
GCL is a self-supervised learning method that learns representations of graphs without labeled data. However, it is hard to design stable graph augmentations that generate proper views for contrastive learning. This paper explores why dropping edges usually outperforms adding edges in GCL and proposes a new algorithm called EPAGCL. The algorithm uses both edge adding and edge dropping, guided by the Error Passing Rate metric. The results show that EPAGCL can improve the performance of GCL.
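To make the two augmentations concrete, here is a minimal, generic sketch of edge dropping and edge adding on an undirected edge list. The function names and the uniform random sampling are illustrative assumptions only; the paper's EPAGCL algorithm selects edges based on its Error Passing Rate metric rather than uniformly.

```python
import random

def drop_edges(edges, p=0.2, seed=0):
    """Create an augmented view by randomly dropping each edge with probability p."""
    rng = random.Random(seed)
    return [e for e in edges if rng.random() >= p]

def add_edges(num_nodes, edges, p=0.2, seed=0):
    """Create an augmented view by adding roughly p * |E| new edges
    between previously unconnected node pairs."""
    rng = random.Random(seed)
    existing = {frozenset(e) for e in edges}
    candidates = [(u, v)
                  for u in range(num_nodes)
                  for v in range(u + 1, num_nodes)
                  if frozenset((u, v)) not in existing]
    k = min(len(candidates), int(p * len(edges)))
    return edges + rng.sample(candidates, k)

# Example: two views of a 4-node cycle graph.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
view_drop = drop_edges(edges, p=0.5, seed=1)
view_add = add_edges(4, edges, p=0.5, seed=1)
```

Intuitively, a dropped edge removes information that was known to be reliable, while an added edge risks connecting unrelated nodes and passing error through the graph, which is the asymmetry the paper's analysis formalizes.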

Keywords

  • Artificial intelligence
  • Self supervised