
Summary of IN-Flow: Instance Normalization Flow for Non-stationary Time Series Forecasting, by Wei Fan et al.


IN-Flow: Instance Normalization Flow for Non-stationary Time Series Forecasting

by Wei Fan, Shun Zheng, Pengyang Wang, Rui Xie, Kun Yi, Qi Zhang, Jiang Bian, Yanjie Fu

First submitted to arXiv on: 30 Jan 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here
Medium Difficulty Summary (original GrooveSquid.com content)
This paper tackles time series forecasting in non-stationary environments where distribution shifts occur. Existing methods either rely on fixed statistics or are tied to specific network architectures. The authors propose a decoupled formulation that separates removing the distribution shift from the forecasting itself, formalized as a bi-level optimization problem. They also introduce instance normalization flow (IN-Flow), an invertible network designed for time series transformation. Unlike traditional normalizing flows, IN-Flow stacks normalization layers with flow-based invertible networks to transform time series distributions. Extensive experiments on synthetic and real-world data demonstrate the method's effectiveness.
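The summary above describes IN-Flow as stacking normalization layers with flow-based invertible layers so that the transformation can be undone after forecasting. As a rough illustration only (this is not the paper's actual architecture; the class names, the affine-coupling choice, and all parameters here are hypothetical stand-ins), an invertible "normalize, then flow" block of that general kind can be sketched in NumPy:

```python
import numpy as np

class InstanceNorm:
    """Per-instance normalization over the time axis.

    Invertible because the per-instance statistics are stored
    during the forward pass and reused in the inverse pass.
    """
    def forward(self, x):
        self.mu = x.mean(axis=-1, keepdims=True)
        self.sigma = x.std(axis=-1, keepdims=True) + 1e-6
        return (x - self.mu) / self.sigma

    def inverse(self, z):
        return z * self.sigma + self.mu

class AffineCoupling:
    """A generic affine coupling layer (a standard invertible-flow
    building block), used here only as a placeholder for the paper's
    flow layers. Weights are random; no training is shown.
    """
    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        half = dim // 2
        self.W_s = 0.1 * rng.standard_normal((half, dim - half))
        self.W_t = 0.1 * rng.standard_normal((half, dim - half))

    def forward(self, x):
        half = x.shape[-1] // 2
        x1, x2 = x[..., :half], x[..., half:]
        s, t = np.tanh(x1 @ self.W_s), x1 @ self.W_t
        # First half passes through; second half gets an invertible affine map.
        return np.concatenate([x1, x2 * np.exp(s) + t], axis=-1)

    def inverse(self, z):
        half = z.shape[-1] // 2
        z1, z2 = z[..., :half], z[..., half:]
        s, t = np.tanh(z1 @ self.W_s), z1 @ self.W_t
        return np.concatenate([z1, (z2 - t) * np.exp(-s)], axis=-1)

# A stacked block: normalize each series instance, then apply the flow.
# Inverting in reverse order recovers the original series exactly,
# which is what lets a forecast be mapped back to the original scale.
rng = np.random.default_rng(1)
x = rng.standard_normal((4, 8))          # 4 series instances, length 8
norm, flow = InstanceNorm(), AffineCoupling(dim=8)
z = flow.forward(norm.forward(x))        # transformed (shift-removed) series
x_rec = norm.inverse(flow.inverse(z))    # exact reconstruction
```

The point of the sketch is the round trip: because every layer is invertible, a model can forecast in the transformed space and then map predictions back through the stored statistics, which is the general mechanism the summary attributes to IN-Flow.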
Low Difficulty Summary (original GrooveSquid.com content)
This research paper is about predicting what will happen in the future when the patterns in the data change over time. Most current methods rely on fixed rules or only work with specific kinds of models. The authors came up with a new way to make predictions that separates handling those changes from the prediction itself, so it can work with different kinds of models. They also created a special kind of network called instance normalization flow (IN-Flow) that transforms the data into a form that is easier to predict, and can transform it back afterwards. By testing their method on both synthetic and real data, they showed that it outperforms other methods.

Keywords

* Artificial intelligence  * Optimization  * Time series