Summary of Leveraging 2D Information for Long-term Time Series Forecasting with Vanilla Transformers, by Xin Cheng et al.


Leveraging 2D Information for Long-term Time Series Forecasting with Vanilla Transformers

by Xin Cheng, Xiuying Chen, Shuqi Li, Di Luo, Xun Wang, Dongyan Zhao, Rui Yan

First submitted to arXiv on: 22 May 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI)


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors; the paper's original abstract)

Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper introduces GridTST, a novel approach to time series prediction using a Transformer architecture. By combining two existing methods, GridTST addresses two challenges in prior work: learning variate-centric representations and preserving temporal information. The model encodes input data as a grid, where the x-axis represents time steps and the y-axis represents variates. This enables efficient processing of information across both the time and variate dimensions, enhancing the model's analytical strength. The authors also integrate a patch technique to retain local semantic information in the embedding, leading to state-of-the-art performance on various real-world datasets.
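The grid-plus-patch encoding described above can be illustrated with a minimal sketch. This is not the paper's implementation; the shapes and parameter names (`patch_len`, `stride`) are assumptions for illustration only.

```python
import numpy as np

def to_grid(series):
    """Arrange a multivariate series of shape (time_steps, variates) as a
    grid with variates on the y-axis and time steps on the x-axis."""
    return series.T  # shape: (variates, time_steps)

def patch_time_axis(grid, patch_len, stride):
    """Split the time axis into overlapping patches, so each token keeps
    local semantic information (the patch technique the summary mentions)."""
    variates, time_steps = grid.shape
    num_patches = (time_steps - patch_len) // stride + 1
    patches = np.stack(
        [grid[:, i * stride : i * stride + patch_len] for i in range(num_patches)],
        axis=1,
    )
    return patches  # shape: (variates, num_patches, patch_len)

series = np.random.randn(96, 7)        # 96 time steps, 7 variates
grid = to_grid(series)                 # (7, 96)
tokens = patch_time_axis(grid, 16, 8)  # (7, 11, 16)
```

A Transformer could then attend over the patch axis (time direction) or over the variate axis of this grid, which is the two-dimensional processing the summary refers to.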
Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper is about making computers better at predicting what will happen next based on patterns they see in data from different times and places. It uses a special kind of computer model called GridTST that combines two other models together. This helps the computer learn more about how things are related over time and across different areas. The authors also find a way to keep important details local, which makes their predictions even better. They test this on real data from all sorts of places and it works really well.

Keywords

» Artificial intelligence  » Embedding  » Time series  » Transformer