


HHGT: Hierarchical Heterogeneous Graph Transformer for Heterogeneous Graph Representation Learning

by Qiuyu Zhu, Liang Zhang, Qianxiong Xu, Kaijun Liu, Cheng Long, Xiaoyang Wang

First submitted to arxiv on: 18 Jul 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Databases (cs.DB)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors): the paper's original abstract.

Medium Difficulty Summary (written by GrooveSquid.com, original content):
Despite the success of Heterogeneous Graph Neural Networks (HGNNs) in modeling real-world Heterogeneous Information Networks (HINs), existing methods have limitations. Research has turned to Graph Transformers (GTs) for enhanced HIN representation learning, but this field is still underdeveloped. Two key shortcomings in current work are the uniform treatment of neighbors and the mixing of nodes with different types during aggregation. To address these gaps, we propose an innovative structure called (k,t)-ring neighborhood that organizes nodes by distance and type. This structure enables the development of a novel Hierarchical Heterogeneous Graph Transformer (HHGT) model that integrates Type-level and Ring-level Transformers for aggregating nodes of diverse types and distances. Our approach is evaluated on downstream tasks, achieving notable improvements in node clustering tasks, with up to 24.75% and 29.25% increases in NMI and ARI, respectively, compared to the best baseline.
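The (k,t)-ring idea described above — partitioning a node's neighborhood first by shortest-path distance k and then by node type t, and aggregating the two levels separately — can be sketched in plain Python. This is an illustrative toy, not the paper's implementation: the helper names (`kt_ring_neighborhood`, `hierarchical_aggregate`) and the use of mean pooling in place of the actual Type-level and Ring-level Transformers are assumptions made here for clarity.

```python
from collections import deque, defaultdict

def kt_ring_neighborhood(adj, node_types, target, max_k):
    """Group a target node's neighbors into (k, t)-rings:
    ring k holds nodes at shortest-path distance k from `target`,
    partitioned by node type t. `adj` maps node -> list of neighbors;
    `node_types` maps node -> type label. Illustrative only."""
    dist = {target: 0}
    rings = defaultdict(lambda: defaultdict(list))  # rings[k][t] -> [nodes]
    queue = deque([target])
    while queue:
        u = queue.popleft()
        if dist[u] >= max_k:
            continue  # do not expand beyond the outermost ring
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                rings[dist[v]][node_types[v]].append(v)
                queue.append(v)
    return rings

def hierarchical_aggregate(rings, features):
    """Toy two-level aggregation mirroring HHGT's hierarchy:
    mean-pool within each type (stand-in for the Type-level Transformer),
    then mean-pool across rings (stand-in for the Ring-level Transformer)."""
    ring_embs = []
    for k in sorted(rings):
        type_embs = []
        for t, nodes in rings[k].items():
            vecs = [features[v] for v in nodes]
            type_embs.append([sum(x) / len(vecs) for x in zip(*vecs)])
        ring_embs.append([sum(x) / len(type_embs) for x in zip(*type_embs)])
    return [sum(x) / len(ring_embs) for x in zip(*ring_embs)]
```

The key design point the sketch illustrates is that nodes of different types are never mixed in a single aggregation step: each (k,t) bucket is pooled on its own before distance rings are combined, which is the separation the paper's hierarchical Transformers exploit.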
Low Difficulty Summary (written by GrooveSquid.com, original content):
Researchers have been working on creating better models for understanding complex networks like social media or scientific collaborations. Current methods are limited because they don’t take into account the different types of connections between people or things. To solve this problem, we created a new way of organizing these connections based on distance and type. This allowed us to develop a new model that can learn more effectively from these complex networks. Our model was tested on some common tasks and performed much better than previous methods.

Keywords

* Artificial intelligence  * Clustering  * Representation learning  * Transformer