Summary of ExoTST: Exogenous-Aware Temporal Sequence Transformer for Time Series Prediction, by Kshitij Tayal et al.
ExoTST: Exogenous-Aware Temporal Sequence Transformer for Time Series Prediction
by Kshitij Tayal, Arvind Renganathan, Xiaowei Jia, Vipin Kumar, Dan Lu
First submitted to arXiv on: 16 Oct 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: None
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here |
| Medium | GrooveSquid.com (original content) | The proposed framework, ExoTST, is a novel transformer-based approach for time series prediction that integrates past endogenous and exogenous variables with current exogenous information. It leverages attention mechanisms and introduces a cross-temporal modality fusion module to efficiently incorporate this information. This allows the model to learn from both past and current exogenous series separately, providing robustness against data uncertainties. The framework is evaluated on real-world carbon flux datasets and time series benchmarks, demonstrating superior performance compared to state-of-the-art baselines, with improvements of up to 10% in prediction accuracy. |
| Low | GrooveSquid.com (original content) | ExoTST is a new way to make predictions about things that change over time, like the amount of carbon dioxide in the air. It’s different from other methods because it looks at both what happened in the past and what’s happening now. This helps the model be more accurate and robust against mistakes or missing data. The researchers tested ExoTST on real-world datasets and found that it worked better than other methods, with improvements of up to 10%. |
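To make the "cross-temporal modality fusion" idea concrete, here is a minimal NumPy sketch of scaled dot-product cross-attention, where embedded past tokens (endogenous + exogenous history) attend over embedded current/known exogenous tokens. All names, shapes, and the residual fusion step are illustrative assumptions, not the paper's actual ExoTST module.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys, values):
    """Scaled dot-product attention: each query attends over all key/value pairs."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)      # (n_queries, n_keys)
    return softmax(scores, axis=-1) @ values    # (n_queries, d)

# Hypothetical setup: L_p past time steps and L_f current/future exogenous
# steps, each already embedded into a shared dimension d.
rng = np.random.default_rng(0)
d, L_p, L_f = 8, 24, 6
past_tokens = rng.normal(size=(L_p, d))   # embeddings of past endogenous + exogenous series
current_exo = rng.normal(size=(L_f, d))   # embeddings of current/known exogenous series

# Past representations query the current exogenous context; a residual
# connection keeps the original past information intact.
fused = past_tokens + cross_attention(past_tokens, current_exo, current_exo)
print(fused.shape)  # (24, 8)
```

Because the two streams are processed separately and only combined through attention, missing or noisy current exogenous inputs degrade the fused representation gracefully rather than corrupting the past encoding, which is one intuition behind the robustness claim in the summary above.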
Keywords
» Artificial intelligence » Attention » Time series » Transformer