
Summary of Mixture Of Link Predictors on Graphs, by Li Ma et al.


by Li Ma, Haoyu Han, Juanhui Li, Harry Shomer, Hui Liu, Xiaofeng Gao, Jiliang Tang

First submitted to arXiv on: 13 Feb 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty summary is the paper's original abstract.
Medium Difficulty Summary (GrooveSquid.com, original content)
The proposed Link-MoE model advances link prediction, the task of forecasting unseen connections in graphs. Recognizing that different node pairs within the same dataset require different pairwise information for accurate prediction, Link-MoE uses multiple Graph Neural Networks (GNNs) as experts and strategically selects the appropriate expert for each node pair based on various types of pairwise information. This approach yields substantial improvements over state-of-the-art baselines, including relative gains of 18.71% in MRR on the Pubmed dataset and 9.59% in Hits@100 on the ogbl-ppa dataset.
Low Difficulty Summary (GrooveSquid.com, original content)
Link prediction is a way to find new connections between things in a graph. This matters because graphs are used to represent relationships between many kinds of data. The problem with current methods is that they don't work well when different parts of the graph need different information to make predictions. To address this, researchers developed a new model called Link-MoE. It uses many smaller models (called experts) and chooses which one to apply based on the type of information each prediction needs. As a result, Link-MoE makes far more accurate predictions than previous methods.
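To make the mixture-of-experts idea concrete, here is a minimal, hypothetical Python sketch (not the authors' code): several toy "expert" link predictors each score a node pair, and a gating function turns pairwise features of that pair (here, a common-neighbor count plus a bias term) into mixture weights. The experts, features, and gating weights are all illustrative assumptions.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def common_neighbors(adj, u, v):
    # Simple pairwise heuristic: number of shared neighbors.
    return len(adj[u] & adj[v])

def gate(pair_features, weights):
    # Linear gating: pairwise features -> one mixture weight per expert.
    logits = [sum(w * f for w, f in zip(row, pair_features)) for row in weights]
    return softmax(logits)

def link_moe_score(experts, gating_weights, adj, u, v):
    # Pairwise features for this node pair (assumed, illustrative feature set).
    feats = [common_neighbors(adj, u, v), 1.0]  # heuristic + bias term
    mix = gate(feats, gating_weights)
    # Weighted combination of expert scores for the pair (u, v).
    return sum(m * expert(adj, u, v) for m, expert in zip(mix, experts))

# Toy experts: a common-neighbor scorer and a degree-product scorer
# (standing in for the trained GNN experts of the real model).
expert_cn = lambda adj, u, v: float(common_neighbors(adj, u, v))
expert_deg = lambda adj, u, v: float(len(adj[u]) * len(adj[v]))

adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
gating_weights = [[1.0, 0.0], [-1.0, 0.0]]  # untrained, illustrative values
score = link_moe_score([expert_cn, expert_deg], gating_weights, adj, 0, 3)
print(round(score, 3))  # -> 1.119
```

The key design point the paper's summary highlights is visible here: the gate is a function of the node pair, so two different pairs can receive different mixtures of the same experts. In the actual model the gating weights would be learned, and the experts are full GNN-based link predictors rather than hand-written heuristics.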

Keywords

  • Artificial intelligence