Summary of Interpretable Mixture Of Experts For Time Series Prediction Under Recurrent and Non-recurrent Conditions, by Zemian Ke et al.
Interpretable mixture of experts for time series prediction under recurrent and non-recurrent conditions
by Zemian Ke, Haocheng Duan, Sean Qian
First submitted to arXiv on: 5 Sep 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Signal Processing (eess.SP)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here |
| Medium | GrooveSquid.com (original content) | This study proposes a Mixture of Experts (MoE) model that improves traffic speed prediction under two distinct conditions: recurrent patterns and non-recurrent, incident-driven patterns. The MoE uses a separate expert model, a Temporal Fusion Transformer, to capture the unique patterns of each condition. A dedicated training pipeline is developed for the non-recurrent expert to address the scarcity of incident data. Evaluations on a real road network show that the MoE achieves lower prediction errors than benchmark algorithms. |
| Low | GrooveSquid.com (original content) | This study improves traffic speed predictions by using different models for recurring patterns and incident-based patterns. It builds a model called a Mixture of Experts (MoE) that learns from these two types of patterns separately. The researchers also developed a way to train the incident model when there isn't much data available. Tested on real traffic data, the model was more accurate than other models. |
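The core idea in the summaries above, a gating function that blends predictions from a "recurrent" expert and a "non-recurrent" expert, can be sketched in a few lines. This is only an illustration, not the paper's actual architecture: the experts here are linear stand-ins for the Temporal Fusion Transformers, and all weights are random.

```python
import numpy as np

rng = np.random.default_rng(0)

def expert(W, x):
    # Stand-in for a Temporal Fusion Transformer expert: a plain linear map.
    return x @ W

def gate(x, Wg):
    # Softmax gate: maps input features to one mixture weight per expert.
    logits = x @ Wg
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy dimensions (hypothetical): 4 input features, scalar speed output, 2 experts.
d_in, n_experts = 4, 2
W_recurrent = rng.normal(size=(d_in, 1))     # expert for recurrent conditions
W_nonrecurrent = rng.normal(size=(d_in, 1))  # expert for non-recurrent conditions
Wg = rng.normal(size=(d_in, n_experts))      # gating-network weights

x = rng.normal(size=(3, d_in))               # batch of 3 feature vectors
w = gate(x, Wg)                              # (3, 2) mixture weights, rows sum to 1

# Stack the two experts' predictions and combine them with the gate weights.
preds = np.stack([expert(W_recurrent, x),
                  expert(W_nonrecurrent, x)], axis=-1)   # (3, 1, 2)
y_hat = (preds * w[:, None, :]).sum(axis=-1)             # (3, 1) blended speeds
```

In the paper's setting the gate would let the incident-trained expert dominate during non-recurrent conditions, which is also what makes the mixture interpretable: the gate weights show which regime the model believes it is in.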
Keywords
» Artificial intelligence » Mixture of experts