
Summary of Beyond Generalization: A Survey of Out-of-Distribution Adaptation on Graphs, by Shuhan Liu et al.


Beyond Generalization: A Survey of Out-Of-Distribution Adaptation on Graphs

by Shuhan Liu, Kaize Ding

First submitted to arXiv on: 17 Feb 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (written by GrooveSquid.com; original content)
This paper surveys graph Out-of-Distribution (OOD) adaptation methods, which aim to mitigate distribution shifts on graphs by adapting knowledge learned under one distribution to another. The authors formally formulate two problem scenarios, training-time and test-time graph OOD adaptation, and categorize existing methods by learning paradigm, discussing techniques such as self-supervised learning and meta-learning. The paper also highlights promising research directions and open challenges.

Low Difficulty Summary (written by GrooveSquid.com; original content)
This survey aims to help graph machine learning models perform better when the data distribution changes between training and testing. It looks at two main problem scenarios: adapting during training and adapting after training is finished. The authors group existing methods into categories such as self-supervised learning and meta-learning, explaining how these techniques work. The paper also suggests areas for further research.

Keywords

  • Artificial intelligence
  • Machine learning
  • Meta learning
  • Self supervised