Summary of UnetTSF: A Better Performance Linear Complexity Time Series Prediction Model, by Li Chu et al.


UnetTSF: A Better Performance Linear Complexity Time Series Prediction Model

by Li Chu, Xiao Bingjia, Yuan Qiping

First submitted to arXiv on: 5 Jan 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

Summary difficulty: High — written by the paper authors
The high difficulty version is the paper's original abstract, available on arXiv.

Summary difficulty: Medium — written by GrooveSquid.com (original content)
Transformer-based models have recently made significant progress in time series prediction and have become standard baselines. This paper proposes UnetTSF, a U-Net-style time series prediction model with linear complexity. The architecture incorporates Feature Pyramid Network (FPN) technology to extract features from the time series, replacing traditional trend-and-seasonal decomposition. Tested on 8 open-source datasets, UnetTSF achieved 31 out of 32 best results against the strongest linear baseline, DLinear, reducing Mean Squared Error (MSE) by 10.1% and Mean Absolute Error (MAE) by 9.1% on average. Compared to the more complex Transformer-based PatchTST model, UnetTSF obtained the best MSE in 9 of the 32 test settings and the best MAE in 15.
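
The paper's code and exact layer configuration are not reproduced on this page, so the sketch below only illustrates the general idea described above: a U-Net-style forecaster that builds a feature pyramid by pooling the input series at several scales, applies a linear head at each scale, and fuses the results into one forecast. It is written in PyTorch; the class name UNetStyleLinearForecaster, the pooling factors, and the fusion layer are assumptions, not the authors' UnetTSF implementation.

# Hypothetical sketch of a U-Net-style, FPN-based linear forecaster.
# Pooling factors, layer sizes, and the fusion scheme are assumptions,
# not the authors' exact UnetTSF architecture.
import torch
import torch.nn as nn


class UNetStyleLinearForecaster(nn.Module):
    def __init__(self, input_len=96, pred_len=96, levels=3):
        super().__init__()
        # Feature-pyramid lengths: halve the input length at each level.
        lens = [input_len // (2 ** i) for i in range(levels)]
        # Average pooling builds the pyramid (level 0 keeps the raw series).
        self.pools = nn.ModuleList(
            [nn.AvgPool1d(kernel_size=2 ** i, stride=2 ** i) for i in range(levels)]
        )
        # One linear head per pyramid level, mapping that scale to the horizon.
        self.heads = nn.ModuleList([nn.Linear(l, pred_len) for l in lens])
        # Fuse the per-level predictions with a final linear layer.
        self.fuse = nn.Linear(levels * pred_len, pred_len)

    def forward(self, x):
        # x: (batch, channels, input_len); forecasts are produced per channel.
        outs = [head(pool(x)) for pool, head in zip(self.pools, self.heads)]
        return self.fuse(torch.cat(outs, dim=-1))  # (batch, channels, pred_len)


if __name__ == "__main__":
    model = UNetStyleLinearForecaster(input_len=96, pred_len=96, levels=3)
    dummy = torch.randn(8, 7, 96)   # e.g. 7 variables, 96 past time steps
    print(model(dummy).shape)       # torch.Size([8, 7, 96])

Keeping every layer linear is what gives the model linear complexity in the input length, while the pooling pyramid plays the role the paper attributes to the FPN: capturing coarse trends and finer fluctuations at different scales without an explicit trend-seasonal decomposition.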

Summary difficulty: Low — written by GrooveSquid.com (original content)
A new way of predicting future values from past data is being developed. Powerful AI models called transformers have been very successful at this task, but they are complex. The new method, called UnetTSF, takes a simpler route and combines two ideas to make good predictions. First, it breaks the data down into smaller parts and looks for patterns in each part. Second, it uses a technique called Feature Pyramid Network (FPN) to find the important features in the data that help with prediction. Tested on many different datasets, the new method performed better than comparable methods in 31 out of 32 cases, which means it is very good at predicting future values from past data.

Keywords

* Artificial intelligence  * Feature pyramid  * MAE  * MSE  * Time series  * Transformer