

Timer: Generative Pre-trained Transformers Are Large Time Series Models

by Yong Liu, Haoran Zhang, Chenyu Li, Xiangdong Huang, Jianmin Wang, Mingsheng Long

First submitted to arXiv on: 4 Feb 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Machine Learning (stat.ML)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high-difficulty version is the paper's original abstract, available on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper works toward large time series models (LTSMs) that can handle real-world, data-scarce scenarios. Despite advances in deep learning for time series analysis, small models still perform well on current benchmarks, largely because those benchmarks are saturated. Large language models, by contrast, have demonstrated capabilities such as few-shot generalization and scalability. To bridge this gap, the authors take an early step toward LTSMs by pre-training on data unified into a single format called the single-series sequence (S3). They cast forecasting, imputation, and anomaly detection as one generative task and introduce the Time Series Transformer (Timer), which achieves promising results as an LTSM. The authors make their code and datasets available on GitHub. A minimal sketch of this generative pre-training setup appears after the summaries below.

Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper tackles a problem in time series analysis: small models score well on benchmarks but are less useful in real-world situations where data is limited. Large language models can do impressive things, such as learning quickly from small amounts of data, and the authors want to apply those ideas to time series analysis. They created a new way of formatting data called the single-series sequence (S3) and used it to train a single model that can handle different tasks, such as forecasting and detecting unusual patterns. This model is called the Time Series Transformer, or Timer for short.
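To make the idea concrete, here is a minimal sketch of the kind of generative pre-training the summaries describe: a univariate series is normalized and split into fixed-length segments (a single-series sequence), and a decoder-only Transformer learns to predict each next segment from the previous ones. This is not the authors' Timer implementation; the class and function names (TinyTimeSeriesGPT, to_s3), the segment length, and all layer sizes are illustrative assumptions.

```python
# Sketch only: GPT-style next-segment prediction on a single-series sequence.
# All sizes and names are illustrative, not the paper's actual configuration.
import torch
import torch.nn as nn


class TinyTimeSeriesGPT(nn.Module):
    def __init__(self, segment_len=24, d_model=64, n_heads=4, n_layers=2, max_segments=64):
        super().__init__()
        self.segment_len = segment_len
        self.embed = nn.Linear(segment_len, d_model)           # segment -> token embedding
        self.pos = nn.Embedding(max_segments, d_model)         # learned positional embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=4 * d_model,
                                           batch_first=True, norm_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)   # used with a causal mask
        self.head = nn.Linear(d_model, segment_len)            # predict the next segment's values

    def forward(self, segments):
        # segments: (batch, n_segments, segment_len)
        b, n, _ = segments.shape
        x = self.embed(segments) + self.pos(torch.arange(n, device=segments.device))
        causal = nn.Transformer.generate_square_subsequent_mask(n).to(segments.device)
        h = self.blocks(x, mask=causal)
        return self.head(h)                                     # (batch, n_segments, segment_len)


def to_s3(series, segment_len=24):
    """Normalize one univariate series and reshape it into a sequence of segments."""
    series = (series - series.mean()) / (series.std() + 1e-8)
    n = series.shape[0] // segment_len
    return series[: n * segment_len].reshape(1, n, segment_len)


# Toy training loop: autoregressive next-segment prediction on a synthetic series.
model = TinyTimeSeriesGPT()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
series = torch.sin(torch.linspace(0, 50, 24 * 32))              # stand-in for real data
tokens = to_s3(series)
for step in range(5):
    pred = model(tokens[:, :-1])                                 # predict segment t+1 from segments <= t
    loss = nn.functional.mse_loss(pred, tokens[:, 1:])
    opt.zero_grad(); loss.backward(); opt.step()
```

Under this framing, the downstream tasks the paper mentions become variants of one generative problem: forecasting generates segments beyond the observed sequence, imputation generates segments at masked positions, and anomaly detection flags observations that the model predicts poorly.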

Keywords

  • Artificial intelligence
  • Anomaly detection
  • Deep learning
  • Few shot
  • Generalization
  • Time series
  • Transformer