
Summary of CondTSF: One-line Plugin of Dataset Condensation for Time Series Forecasting, by Jianrong Ding et al.


CondTSF: One-line Plugin of Dataset Condensation for Time Series Forecasting

by Jianrong Ding, Zhanyu Liu, Guanjie Zheng, Haiming Jin, Linghe Kong

First submitted to arXiv on: 4 Jun 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
Dataset condensation is a novel technique that generates a small synthetic dataset to train deep neural networks efficiently. The goal is to ensure that models trained with this condensed data perform similarly to those trained on full datasets. However, existing methods primarily focus on classification tasks, posing challenges in adapting them to time series forecasting (TS-forecasting). This discrepancy arises from differences in evaluating synthetic data. In classification, well-distilled synthetic data ensures identical labels for the same input, regardless of output logits distribution. Conversely, TS-forecasting assesses effectiveness by comparing predictions between models. Our proposed Dataset Condensation for Time Series Forecasting (CondTSF) addresses this gap by optimizing the condensation objective for TS-forecasting. We demonstrate a one-line plugin that enhances performance by reducing prediction distance. Extensive experiments on eight time series datasets show that CondTSF consistently improves previous dataset condensation methods, especially at low condensing ratios.
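The objective described above can be sketched with a toy example. Everything here (the sine-wave data, the linear ridge-regression forecaster, and the way the synthetic targets are set) is an illustrative assumption, not the paper's actual method; the sketch only shows the shape of the idea: train a model on a tiny synthetic set, then evaluate it by the distance between its forecasts and the full-data model's forecasts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "full" dataset: a noisy sine wave cut into (input, target) windows.
T, L_IN, L_OUT = 500, 8, 2
series = np.sin(np.linspace(0, 20, T)) + 0.05 * rng.standard_normal(T)

def make_windows(s, l_in, l_out):
    n = len(s) - l_in - l_out + 1
    X = np.stack([s[i:i + l_in] for i in range(n)])
    Y = np.stack([s[i + l_in:i + l_in + l_out] for i in range(n)])
    return X, Y

def fit_linear(X, Y, ridge=1e-3):
    # Closed-form ridge regression: W = (X^T X + lambda*I)^-1 X^T Y
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

X_full, Y_full = make_windows(series, L_IN, L_OUT)
W_full = fit_linear(X_full, Y_full)

# Condensed dataset: a handful of synthetic input windows
# (here simply subsampled from the full data, for illustration).
X_syn = X_full[::50].copy()

# One possible reading of the "one-line plugin" idea (an assumption, not
# the paper's exact update): set the synthetic targets to the full-data
# model's predictions, so a model trained on the synthetic set imitates
# the full-data model's outputs rather than the noisy ground truth.
Y_syn = X_syn @ W_full

W_syn = fit_linear(X_syn, Y_syn)

# TS-forecasting evaluates condensation by prediction distance:
# how far the synthetic-trained model's forecasts are from the
# full-trained model's forecasts on the same inputs.
pred_dist = np.mean((X_full @ W_syn - X_full @ W_full) ** 2)
```

Note the contrast with classification, where only the predicted label has to match: here the entire forecast vector is compared, so the condensation objective directly minimizes a prediction distance.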
Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper talks about a new way to make training neural networks faster and cheaper. The authors create a much smaller version of the training data, a technique called “dataset condensation.” The problem is that most existing methods only work well for certain types of tasks, like classifying things. When it comes to forecasting future events, like stock prices or weather, these methods don’t work as well. The researchers propose a new way to do dataset condensation specifically for forecasting. They tested their idea on several datasets and found that it improves performance, especially when the condensed dataset is kept very small.

Keywords

» Artificial intelligence  » Classification  » Logits  » Synthetic data  » Time series