Summary of Time Series Imputation with Multivariate Radial Basis Function Neural Network, by Chanyoung Jung and Yun Jang


Time Series Imputation with Multivariate Radial Basis Function Neural Network

by Chanyoung Jung, Yun Jang

First submitted to arXiv on: 24 Jul 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI)

GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (written by GrooveSquid.com, original content)
The researchers address missing values in time series data by proposing a novel imputation model based on the Radial Basis Function Neural Network (RBFNN). The model learns local information from timestamps to build a continuous function, and it incorporates the time gaps between observations to help it learn the missing values. This model, the Missing Imputation Multivariate RBFNN (MIM-RBFNN), has limitations in exploiting temporal information, so the authors also propose an extension, the Missing Value Imputation Recurrent Neural Network with Continuous Function (MIRNN-CF), which uses the continuous function generated by MIM-RBFNN. Both models are evaluated on two real-world datasets with non-random and random missing patterns. A minimal sketch of the underlying RBF idea follows the summaries below.

Low Difficulty Summary (written by GrooveSquid.com, original content)
The researchers are working on a problem where some data values are missing and there is no way to know what those values should have been. One way to tackle this is with something called a Radial Basis Function Neural Network (RBFNN), which does a good job of guessing the missing values. The new model they propose uses an RBFNN and adds extra information from timestamps, so it can do an even better job. They also came up with another idea that combines this model with a Recurrent Neural Network to make it better still. They tested both models on real-world datasets.

Keywords

» Artificial intelligence  » Neural network  » Time series