Summary of TransformerLSR: Attentive Joint Model of Longitudinal Data, Survival, and Recurrent Events with Concurrent Latent Structure, by Zhiyue Zhang et al.
TransformerLSR: Attentive Joint Model of Longitudinal Data, Survival, and Recurrent Events with Concurrent Latent Structure
by Zhiyue Zhang, Yao Zhao, Yanxun Xu
First submitted to arXiv on: 4 Apr 2024
Categories
- Main: Machine Learning (stat.ML)
- Secondary: Machine Learning (cs.LG); Applications (stat.AP); Methodology (stat.ME)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here |
| Medium | GrooveSquid.com (original content) | A new deep learning framework, called TransformerLSR, is proposed for jointly modeling longitudinal measurements, recurrent events, and survival data in applications such as biomedical studies and epidemiology. Existing joint models are limited by heavy parametric assumptions and scalability issues. To address these limitations, the authors incorporate deep learning techniques into the joint modeling process. Specifically, they develop a transformer-based framework that integrates deep temporal point processes to model the dependencies between recurrent events, terminal events (such as death), and past longitudinal measurements. The proposed framework also introduces a novel trajectory representation and model architecture that can incorporate prior knowledge of known latent structures among concurrent variables. Simulation studies and analysis of a real-world medical dataset demonstrate the effectiveness and necessity of TransformerLSR. |
| Low | GrooveSquid.com (original content) | Joint modeling of recurrent events, survival data, and longitudinal measurements is crucial in biomedical studies, epidemiology, and the social sciences. Researchers have developed various joint models, but these often rely on heavy parametric assumptions and struggle with scalability. To overcome these limitations, a new framework called TransformerLSR is proposed, which incorporates deep learning techniques into the joint modeling process. This framework uses a transformer-based architecture to model dependencies between events and measurements, and also introduces a novel trajectory representation and model architecture. Simulation studies and analysis of real-world data demonstrate the effectiveness and importance of this framework. |
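To give a concrete flavor of the core idea described above, here is a minimal sketch of how a transformer-style attention layer can drive a temporal point process: past events are embedded, attended over from the current time point, and the pooled history feeds a conditional intensity function λ(t). This is an illustrative toy, not the paper's TransformerLSR implementation; the embedding, weights, and dimensions below are all assumptions chosen for brevity.

```python
import numpy as np

# Toy sketch (NOT the paper's code): attention over past event embeddings
# produces a history summary, which is mapped through a softplus to a
# positive conditional intensity lambda(t | history).

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative choice)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys, values):
    """Scaled dot-product attention over the event history."""
    scores = keys @ query / np.sqrt(d)
    return softmax(scores) @ values

def intensity(t, event_times, embed_w, out_w):
    """Conditional intensity lambda(t | past events); softplus keeps it positive."""
    hist = [tau for tau in event_times if tau < t]
    if not hist:
        return np.log1p(np.exp(out_w[-1]))          # baseline intensity, no history
    # toy sinusoidal embedding of each past event time
    E = np.stack([np.sin(tau * embed_w) for tau in hist])  # (n_events, d)
    q = np.sin(t * embed_w)                                # query for time t
    h = attend(q, E, E)                                    # pooled history, shape (d,)
    return np.log1p(np.exp(h @ out_w[:-1] + out_w[-1]))    # softplus(linear readout)

embed_w = rng.standard_normal(d)
out_w = rng.standard_normal(d + 1)
lam = intensity(2.5, [0.4, 1.1, 2.0], embed_w, out_w)
print(lam)  # a positive scalar, by construction of the softplus
```

In the actual model, such attention-based intensities for recurrent and terminal events are learned jointly with the longitudinal trajectory, which is what lets the framework avoid the parametric assumptions of classical joint models.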
Keywords
* Artificial intelligence * Deep learning * Transformer