
Summary of Deja vu: Contrastive Historical Modeling with Prefix-tuning for Temporal Knowledge Graph Reasoning, by Miao Peng et al.


Deja vu: Contrastive Historical Modeling with Prefix-tuning for Temporal Knowledge Graph Reasoning

by Miao Peng, Ben Liu, Wenjie Xu, Zihao Jiang, Jiahui Zhu, Min Peng

First submitted to arXiv on: 25 Mar 2024

Categories

  • Main: Artificial Intelligence (cs.AI)
  • Secondary: Computation and Language (cs.CL); Machine Learning (cs.LG)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract, which can be read on arXiv.
Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper proposes a novel approach to Temporal Knowledge Graph Reasoning (TKGR), the task of inferring missing facts in incomplete temporal knowledge graphs (TKGs), which has recently attracted significant attention. To address the limitations of existing text-based methods, the authors introduce ChapTER, a Contrastive historical modeling framework with prefix-tuning for TEmporal Reasoning. ChapTER uses pseudo-Siamese encoders to balance textual and temporal information through contrastive estimation between queries and candidates, and introduces virtual time prefix tokens that enable frozen pre-trained language models (PLMs) to handle TKGR tasks under different settings. Experimental results show that ChapTER outperforms competitive baselines while tuning only 0.17% of its parameters.
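
The mechanics described above lend themselves to a short illustration: two towers with a shared frozen backbone (a "pseudo-Siamese" setup) encode query text and candidate text, trainable virtual time prefix tokens are prepended to the otherwise frozen encoder, and an in-batch contrastive loss pulls each query toward its true candidate. The sketch below is a minimal stand-in under assumptions, not the paper's implementation: the toy transformer encoder, the prefix length, the pooling choice, and the InfoNCE-style loss are all generic substitutes.

```python
# Minimal sketch of ChapTER-style contrastive scoring (illustrative only).
# Assumptions: a toy frozen "PLM" stands in for the real pre-trained encoder;
# n_prefix, d_model, the pooling, and the InfoNCE loss are generic choices,
# not the paper's actual architecture or hyperparameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

d_model, n_prefix, vocab = 64, 4, 1000

class FrozenEncoder(nn.Module):
    """Stand-in for a frozen pre-trained language model."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(vocab, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        for p in self.parameters():      # the PLM weights stay frozen
            p.requires_grad = False

    def forward(self, token_ids, prefix):
        # Prepend trainable virtual time prefix tokens to the token embeddings.
        x = self.embed(token_ids)                           # (B, L, d)
        x = torch.cat([prefix.expand(x.size(0), -1, -1), x], dim=1)
        h = self.encoder(x)
        return h[:, 0]                                      # first position as summary vector

# Pseudo-Siamese setup: both towers share the frozen backbone; giving each
# tower its own trainable prefix is an assumption made for this sketch.
encoder = FrozenEncoder()
query_prefix = nn.Parameter(torch.randn(1, n_prefix, d_model) * 0.02)
cand_prefix = nn.Parameter(torch.randn(1, n_prefix, d_model) * 0.02)

def info_nce(queries, candidates, temperature=0.05):
    """Contrastive estimation: each query's true candidate is the positive,
    all other in-batch candidates act as negatives."""
    q = F.normalize(queries, dim=-1)
    c = F.normalize(candidates, dim=-1)
    logits = q @ c.t() / temperature                        # (B, B) similarity matrix
    labels = torch.arange(q.size(0))                        # diagonal = positives
    return F.cross_entropy(logits, labels)

# Toy batch: token ids for (query text with timestamp, candidate entity text).
query_ids = torch.randint(0, vocab, (8, 16))
cand_ids = torch.randint(0, vocab, (8, 16))

q_vec = encoder(query_ids, query_prefix)
c_vec = encoder(cand_ids, cand_prefix)
loss = info_nce(q_vec, c_vec)
loss.backward()   # gradients flow only into the virtual prefix tokens
```

Because the backbone is frozen, only the prefix parameters receive gradients; this is how a method of this shape ends up tuning a tiny fraction of total parameters (the paper reports 0.17%).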
Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper finds a new way to predict what will happen by looking at what happened in the past. This is called Temporal Knowledge Graph Reasoning (TKGR): the task of filling in missing information about events tied to specific times. To do this, researchers use language models, which are like super-smart computers that can read and understand text. However, these models were not designed for TKGR and need adjustments to work well. The authors propose a new way to adjust them, called ChapTER. It is like a special sauce that helps the model learn from past events to make better predictions about the future. In tests, ChapTER performed better than other methods while changing only a tiny fraction of the model's settings.

Keywords

  • Artificial intelligence
  • Attention
  • Knowledge graph