
DeformTime: Capturing Variable Dependencies with Deformable Attention for Time Series Forecasting

by Yuxuan Shu, Vasileios Lampos

First submitted to arXiv on: 11 Jun 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (paper authors)
Read the original abstract here
Medium Difficulty Summary (GrooveSquid.com, original content)
The proposed DeformTime neural network architecture tackles a limitation of multivariate time series (MTS) forecasting approaches that focus solely on autoregressive formulations and neglect exogenous indicators. By incorporating deformable attention blocks (DABs), it captures correlated temporal patterns from the input data, enhancing forecasting accuracy. There are two kinds of DABs: variable DABs, which learn dependencies across variables from different time steps, and temporal DABs, which preserve temporal dependencies in data from previous time steps. The input data is additionally transformed to enhance learning from the deformed series as it passes through a DAB. Comparative experiments on 6 MTS datasets demonstrate DeformTime’s improved accuracy against competitive methods, reducing mean absolute error by 10% on average, with consistent performance gains across longer forecasting horizons.
Low Difficulty Summary (GrooveSquid.com, original content)
DeformTime is a new approach to forecasting in multivariate time series (MTS) data. Currently, most methods focus only on how things change over time and forget about other important information. DeformTime tries to fix this by using special blocks that help it learn from different parts of the data. It has two types of blocks: one that looks at how different variables are connected, and another that helps it remember what happened earlier. This makes its forecasts more accurate. The researchers tested DeformTime on many datasets and found that it works better than other methods in most cases.
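The core idea behind deformable attention is that each query attends to a small set of learned, fractional sampling positions rather than to every position. The toy sketch below illustrates this over a 1-D time axis; it is a simplified illustration under assumed shapes (in DeformTime itself the offsets and weights are produced by learned layers, and the mechanism is applied along both the variable and temporal dimensions), not the authors' implementation.

```python
import numpy as np

def deformable_attention_1d(x, offsets, weights):
    """Toy 1-D deformable attention over the time axis.

    x:       (T, D) input series (T time steps, D channels)
    offsets: (T, K) fractional sampling offsets per query step
    weights: (T, K) attention weights per sampling point (rows sum to 1)

    Each step t gathers K deformed samples at positions t + offsets[t]
    via linear interpolation, then mixes them with its attention weights.
    """
    T, D = x.shape
    out = np.zeros_like(x)
    for t in range(T):
        pos = np.clip(t + offsets[t], 0, T - 1)       # deformed sampling positions
        lo = np.floor(pos).astype(int)                # neighbouring integer steps
        hi = np.minimum(lo + 1, T - 1)
        frac = (pos - lo)[:, None]
        samples = (1 - frac) * x[lo] + frac * x[hi]   # (K, D) interpolated values
        out[t] = weights[t] @ samples                 # weighted mix of K samples
    return out

# Example: 8 time steps, 3 channels, 2 sampling points per query
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 3))
offsets = rng.uniform(-2, 2, size=(8, 2))             # hypothetical learned offsets
w = rng.uniform(size=(8, 2))
w /= w.sum(axis=1, keepdims=True)                     # normalise attention weights
y = deformable_attention_1d(x, offsets, w)
print(y.shape)  # (8, 3)
```

Because the sampling positions are data-dependent rather than fixed, the same mechanism can pick out informative variables or lags, which is what the variable and temporal DABs exploit.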

Keywords

» Artificial intelligence  » Attention  » Autoregressive  » Neural network  » Time series