Summary of PGTNet: A Process Graph Transformer Network for Remaining Time Prediction of Business Process Instances, by Keyvan Amiri Elyasi et al.
PGTNet: A Process Graph Transformer Network for Remaining Time Prediction of Business Process Instances
by Keyvan Amiri Elyasi, Han van der Aa, Heiner Stuckenschmidt
First submitted to arXiv on: 9 Apr 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract here |
Medium | GrooveSquid.com (original content) | PGTNet is an approach that transforms event logs into graph datasets, enabling the training of Process Graph Transformer Networks (PGT) to predict the remaining time of business process instances. The method consistently outperforms state-of-the-art deep learning approaches on 20 publicly available real-world event logs, and it is particularly effective for complex processes, where existing methods struggle to capture control-flow relationships and long-range dependencies. (A rough sketch of the log-to-graph step appears below the table.) |
Low | GrooveSquid.com (original content) | PGTNet is a new way to use computers to predict when things will happen. It looks at logs of events that have already happened and turns them into special kinds of diagrams. Then it uses these diagrams to teach computers to predict how long it will take for a process to finish. This approach does better than other computer methods on lots of different real-world event logs, and it is especially good at making predictions for complicated processes. |
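The core preprocessing idea in the medium summary is turning an event log into a graph dataset. The snippet below is a minimal, illustrative sketch of that idea on a toy log: each case prefix becomes a small directed graph whose nodes are activities, whose edges are directly-follows relations annotated with elapsed time, and whose prediction target is the case's remaining time. The log structure and all names here are assumptions for illustration; this is not the authors' actual preprocessing pipeline, which builds full graph datasets for training a graph transformer.

```python
# Minimal sketch (not the paper's code): convert an event-log prefix into a
# graph (activity nodes, directly-follows edges with elapsed-time features)
# plus the remaining-time target for that prefix.
from datetime import datetime

# Toy event log for a single case; fields follow a typical XES-style layout.
toy_log = [
    {"case": "c1", "activity": "Register", "timestamp": datetime(2024, 1, 1, 9, 0)},
    {"case": "c1", "activity": "Review",   "timestamp": datetime(2024, 1, 1, 11, 30)},
    {"case": "c1", "activity": "Approve",  "timestamp": datetime(2024, 1, 2, 10, 0)},
    {"case": "c1", "activity": "Archive",  "timestamp": datetime(2024, 1, 3, 8, 0)},
]

def prefix_to_graph(case_events, prefix_len):
    """Build (nodes, edges, remaining_time) for the first `prefix_len` events of a case."""
    prefix = case_events[:prefix_len]
    nodes = sorted({e["activity"] for e in prefix})
    index = {activity: i for i, activity in enumerate(nodes)}
    edges = []
    for prev, curr in zip(prefix, prefix[1:]):
        # Directly-follows edge, annotated with elapsed seconds between the two events.
        delta = (curr["timestamp"] - prev["timestamp"]).total_seconds()
        edges.append((index[prev["activity"]], index[curr["activity"]], delta))
    # Prediction target: time from the last prefix event until the case completes.
    remaining = (case_events[-1]["timestamp"] - prefix[-1]["timestamp"]).total_seconds()
    return nodes, edges, remaining

nodes, edges, target = prefix_to_graph(toy_log, prefix_len=2)
print(nodes)   # ['Register', 'Review']
print(edges)   # [(0, 1, 9000.0)]
print(target)  # 160200.0 seconds until the case completes
```

In the approach the summary describes, graphs like these would be fed to a graph transformer trained as a regressor on the remaining-time target; the sketch only shows the data-shaping idea.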
Keywords
» Artificial intelligence » Deep learning » Transformer