Summary of RoTHP: Rotary Position Embedding-based Transformer Hawkes Process, by Anningzhe Gao et al.
RoTHP: Rotary Position Embedding-based Transformer Hawkes Process
by Anningzhe Gao, Shan Dai
First submitted to arXiv on: 11 May 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: None
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | High Difficulty Summary Read the original abstract here |
Medium | GrooveSquid.com (original content) | Medium Difficulty Summary This paper proposes a new architecture for Neural Hawkes Processes (NHPs) called the Rotary Position Embedding-based Transformer Hawkes Process (RoTHP). NHPs model asynchronous event sequences, such as financial transactions and user behaviors. RoTHP addresses two weaknesses of previous NHPs: unsuitability for sequence prediction and sensitivity to temporal translation or noise in the timestamps. By encoding relative time with rotary position embeddings, the architecture gains translation invariance and the flexibility needed for sequence prediction tasks. Empirical results show that RoTHP generalizes better under timestamp translations and on sequence prediction tasks. |
Low | GrooveSquid.com (original content) | Low Difficulty Summary This paper introduces a new kind of model for understanding when things happen. It’s called the Rotary Position Embedding-based THP (RoTHP). This kind of model is used to understand when people do certain actions, like making transactions or posting on social media. RoTHP solves two big problems with earlier models: it can predict what will happen next, and it’s not thrown off by small shifts in time. It does this by using special “embeddings” that capture the time relationships between events. The results show that RoTHP is good at predicting things even when the timing of events shifts. |
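The translation invariance mentioned in the summaries comes from the rotary construction: each query/key vector is rotated by angles proportional to its event's timestamp, so attention scores depend only on time *differences*, not absolute times. Below is a minimal NumPy sketch of that idea, assuming a simplified setup (the function name, dimensions, and frequency base are illustrative choices, not the paper's actual implementation).

```python
import numpy as np

def rotary_time_embedding(x, t, base=10000.0):
    """Rotate consecutive feature pairs of x by angles proportional to timestamp t.

    x: feature vector with an even number of dimensions
    t: scalar event timestamp
    """
    d = x.shape[-1]
    assert d % 2 == 0, "feature dimension must be even"
    # One rotation frequency per feature pair, geometrically spaced (as in RoPE).
    inv_freq = base ** (-np.arange(0, d, 2) / d)
    angles = t * inv_freq
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin  # 2D rotation of each (x1, x2) pair
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

# Demo: the q·k attention score is unchanged when both timestamps shift by a constant,
# because the rotations compose into a single rotation by (t_q - t_k).
rng = np.random.default_rng(0)
q, k = rng.standard_normal(8), rng.standard_normal(8)
score = rotary_time_embedding(q, 2.0) @ rotary_time_embedding(k, 5.0)
shifted = rotary_time_embedding(q, 12.0) @ rotary_time_embedding(k, 15.0)
print(np.isclose(score, shifted))
```

The check at the end illustrates why timestamp translations (the noise scenario the paper targets) leave the attention pattern intact: shifting every event time by the same offset rotates queries and keys identically, and the relative angle between them is preserved.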
Keywords
» Artificial intelligence » Embedding » Translation