Summary of Survey and Taxonomy: The Role of Data-Centric AI in Transformer-Based Time Series Forecasting, by Jingjing Xu et al.
Survey and Taxonomy: The Role of Data-Centric AI in Transformer-Based Time Series Forecasting
by Jingjing Xu, Caesar Wu, Yuan-Fang Li, Gregoire Danoy, Pascal Bouvry
First submitted to arXiv on: 29 Jul 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | The paper's original abstract; read it on arXiv. |
Medium | GrooveSquid.com (original content) | The paper explores data-centric AI, which emphasizes the crucial role of data in the machine learning training process. Researchers have developed sophisticated models like the Transformer architecture, which perform well across domains such as natural language processing (NLP), computer vision (CV), and time series forecasting (TSF). However, model performance depends heavily on input preprocessing and output evaluation, justifying a data-centric approach (see the preprocessing sketch below the table). The authors argue that data-centric AI is essential for efficiently training transformer-based TSF models, yet a gap remains in how these two concepts are integrated. This survey aims to fill that gap through an extensive literature review organized by a proposed taxonomy. |
Low | GrooveSquid.com (original content) | The paper talks about how we can improve artificial intelligence (AI) by focusing on data, not just on building better AI models. The authors show that one type of model, called the Transformer, can do many things well, like understand language or see pictures. But it only works if we prepare the data correctly and measure its performance correctly too. The authors say that thinking about data is super important when training these kinds of models. They want to help bridge the gap between these two ideas by looking at past research and making a plan for future work. |
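To make the preprocessing point above concrete, here is a minimal sketch (our own illustration, not code from the survey) of two routine data-centric steps for transformer-based TSF: standardizing a series using statistics fit on the training split only, and slicing it into fixed-length context/horizon windows of the kind a transformer forecaster consumes. The toy series, window lengths, and split ratio are all assumptions made for this example.

```python
# Illustrative data-centric preprocessing for time series forecasting (TSF).
# This is a hedged sketch, not the paper's method; all constants are assumptions.
import numpy as np

def make_windows(series: np.ndarray, context: int, horizon: int):
    """Slice a 1-D series into overlapping (input, target) window pairs."""
    inputs, targets = [], []
    for start in range(len(series) - context - horizon + 1):
        inputs.append(series[start : start + context])
        targets.append(series[start + context : start + context + horizon])
    return np.stack(inputs), np.stack(targets)

# Toy data: a noisy seasonal signal standing in for a real TSF dataset.
rng = np.random.default_rng(0)
t = np.arange(1000, dtype=float)
series = np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(t.size)

# Data-centric step 1: fit normalization statistics on the training split
# only, so no test-set information leaks into preprocessing.
split = int(0.8 * len(series))
mean, std = series[:split].mean(), series[:split].std()
series_norm = (series - mean) / std

# Data-centric step 2: build the fixed-length windows the model consumes.
X_train, y_train = make_windows(series_norm[:split], context=96, horizon=24)
X_test, y_test = make_windows(series_norm[split:], context=96, horizon=24)
print(X_train.shape, y_train.shape, X_test.shape)  # (681, 96) (681, 24) (81, 96)
```

Fitting the scaler on the training split alone is a typical data-centric concern: the same model architecture can look much better or worse depending on whether such leakage is avoided, which is the survey's point about preprocessing and evaluation shaping measured performance.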
Keywords
» Artificial intelligence » Machine learning » NLP » Transformer