
TGTOD: A Global Temporal Graph Transformer for Outlier Detection at Scale

by Kay Liu, Jiahao Ding, MohamadAli Torkamani, Philip S. Yu

First submitted to arXiv on: 1 Dec 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Social and Information Networks (cs.SI)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract, available on its arXiv page.

Medium Difficulty Summary (original content by GrooveSquid.com)
The paper proposes TGTOD, a novel end-to-end Temporal Graph Transformer for Outlier Detection that addresses limitations of existing Transformers on temporal graphs. TGTOD applies global attention to model both structural and temporal dependencies, using a hierarchical architecture composed of a Patch Transformer, a Cluster Transformer, and a Temporal Transformer. Experiments demonstrate the effectiveness of TGTOD, which achieves a 61% improvement in average precision (AP) on the Elliptic dataset while reducing training time by 44x compared to existing Transformers.

Low Difficulty Summary (original content by GrooveSquid.com)
This paper introduces a new way for machines to learn from temporal graphs (graphs whose connections change over time), making them better at detecting unusual patterns. It uses global attention to understand both the structure and the timing of events in the graph. The model is divided into smaller parts, each handled by a different type of transformer, which makes it more efficient. The results show that this method improves outlier detection performance by 61% on one dataset.

Keywords

  • Artificial intelligence
  • Attention
  • Outlier detection
  • Transformer