Pre-trained Graphformer-based Ranking at Web-scale Search (Extended Abstract)
by Yuchen Li, Haoyi Xiong, Linghe Kong, Zeyi Sun, Hongyang Chen, Shuaiqiang Wang, Dawei Yin
First submitted to arXiv on: 25 Sep 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Information Retrieval (cs.IR)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here. |
| Medium | GrooveSquid.com (original content) | This paper proposes a novel approach to learning to rank (LTR) that integrates Transformers and Graph Neural Networks (GNNs) into a unified framework. The authors observe that current approaches handle either ranking score regression or link prediction, but not both, and argue that web-scale search needs a single model covering the two tasks. To address this challenge, they introduce MPGraf, which leverages modular and capsule-based pre-training to cohesively integrate the strengths of Transformers and GNNs (a hedged sketch of such a hybrid design appears after this table). The authors conduct extensive offline and online experiments to evaluate the performance of MPGraf. |
| Low | GrooveSquid.com (original content) | Learning to rank (LTR) is a challenging task that requires a deep understanding of both ranking score regression and link prediction within query-webpage bipartite graphs. Currently, Transformer-based models are used for ranking score regression, while Graph Neural Networks (GNNs) excel at link prediction. To handle the distributional shifts between source datasets and sparsely annotated LTR datasets, researchers have pre-trained GNNs or Transformers on the former and fine-tuned them on the latter. However, pre-training and fine-tuning the two model families separately leaves each confined to its own task. In this paper, the authors propose a framework, known as MPGraf, that integrates Transformer and GNN components into a single model. |
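Neither summary spells out MPGraf's architecture, but the core idea the summaries describe, pairing a Transformer that regresses ranking scores with a GNN that predicts links in the query-webpage bipartite graph, can be illustrated compactly. Below is a minimal, hypothetical PyTorch sketch: `HybridRanker`, its layer sizes, and the single dense GCN layer are all illustrative assumptions, not the paper's actual modular, capsule-based design.

```python
# Hypothetical sketch of a hybrid Transformer + GNN ranker.
# All names and dimensions are illustrative, not from the MPGraf paper.
import torch
import torch.nn as nn


class HybridRanker(nn.Module):
    """Scores query-webpage nodes with a Transformer encoder, then refines
    node representations with one dense GCN layer over the bipartite graph.
    Returns per-node ranking scores and pairwise link-prediction logits."""

    def __init__(self, feat_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.embed = nn.Linear(feat_dim, hidden_dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=4, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.gcn_weight = nn.Linear(hidden_dim, hidden_dim)  # one GCN layer
        self.score_head = nn.Linear(hidden_dim, 1)           # ranking regression

    def forward(self, feats: torch.Tensor, adj: torch.Tensor):
        # feats: (num_nodes, feat_dim) features for query and webpage nodes.
        # adj:   (num_nodes, num_nodes) symmetric bipartite adjacency (dense).
        h = self.embed(feats)
        # Transformer models interactions among all nodes in one query group.
        h = self.transformer(h.unsqueeze(0)).squeeze(0)
        # Normalized GCN propagation: A_hat = D^{-1/2} (A + I) D^{-1/2}.
        a_hat = adj + torch.eye(adj.size(0))
        d_inv_sqrt = a_hat.sum(dim=1).clamp(min=1.0).rsqrt()
        a_hat = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]
        h = torch.relu(self.gcn_weight(a_hat @ h))
        scores = self.score_head(h).squeeze(-1)  # per-node ranking scores
        link_logits = h @ h.T                    # link-prediction logits
        return scores, link_logits


# Toy usage with random features and a random symmetric adjacency.
model = HybridRanker(feat_dim=16)
feats = torch.randn(10, 16)                # e.g. 2 query + 8 webpage nodes
adj = (torch.rand(10, 10) > 0.7).float()
adj = ((adj + adj.T) > 0).float()          # symmetrize the toy graph
scores, link_logits = model(feats, adj)
```

In a real LTR setup one would train the score head with a pointwise, pairwise, or listwise ranking loss and the link logits with binary cross-entropy over observed query-webpage edges; how MPGraf's modular and capsule-based pre-training stages these objectives is detailed in the paper itself, not reproduced here.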
Keywords
» Artificial intelligence » GNN » Regression » Transformer