
Summary of LSTTN: A Long-Short Term Transformer-based Spatio-temporal Neural Network for Traffic Flow Forecasting, by Qinyao Luo et al.


LSTTN: A Long-Short Term Transformer-based Spatio-temporal Neural Network for Traffic Flow Forecasting

by Qinyao Luo, Silu He, Xing Han, Yuhan Wang, Haifeng Li

First submitted to arxiv on: 25 Mar 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI); Social and Information Networks (cs.SI)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same paper but are written at different levels of difficulty: the medium- and low-difficulty versions are original summaries by GrooveSquid.com, while the high-difficulty version is the paper’s original abstract. Feel free to learn from whichever version suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (original content by GrooveSquid.com)
A novel framework, LSTTN, is proposed for accurate traffic forecasting. It comprehensively considers both long- and short-term features in historical traffic flow using a Transformer-based approach. The model first learns compressed, contextual subseries temporal representations from long historical series through masked subseries Transformer pretraining. It then extracts long-term trends with stacked 1D dilated convolution layers and periodic features with dynamic graph convolution layers, while a short-term trend extractor captures fine-grained short-term temporal features. The fused features are used for prediction (a rough code sketch of this pipeline follows after the summaries). Experiments on four real-world datasets show that LSTTN improves on baseline models by 5.63% to 16.78% in long-term forecasting. The code is available at this GitHub link.

Low Difficulty Summary (original content by GrooveSquid.com)
A new way to predict traffic flow has been developed. It looks at both short-term and long-term patterns in past traffic data to make better predictions about the future. The method uses a special kind of computer model called a Transformer to learn from historical traffic information. This helps the model understand complex trends and periodic features in traffic flow. The long-term patterns it finds are then combined with recent traffic data to make more accurate predictions. Tests on real-world datasets show that this new approach can improve prediction accuracy by up to 16.78%.

Keywords

  • Artificial intelligence
  • Pretraining
  • Transformer