Nonparametric estimation of Hawkes processes with RKHSs

by Anna Bonnet, Maxime Sangnier

First submitted to arXiv on: 1 Nov 2024

Categories

  • Main: Machine Learning (stat.ML)
  • Secondary: Machine Learning (cs.LG); Methodology (stat.ME)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each of the summaries below covers the same paper at a different level of difficulty: the medium and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper introduces a novel approach to nonparametric estimation of nonlinear multivariate Hawkes processes, which model complex interactions such as those between neurons in neuroscience applications. The interaction functions are assumed to lie in a reproducing kernel Hilbert space (RKHS), a setting that accommodates exciting and inhibiting effects as well as refractory periods. To overcome the resulting methodological challenges, the authors establish a representer theorem for approximated versions of the log-likelihood and least-squares criteria, and they propose an estimation method relying on two simple approximations, one of the ReLU function and one of the integral operator. The paper provides approximation bounds showing that the statistical effect of these approximations is negligible, and presents numerical results on synthetic data demonstrating the estimator’s good asymptotic behavior and its performance relative to related techniques.
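
To make these ingredients concrete, here is a minimal Python sketch. It is not the authors’ implementation: the Gaussian kernel, the softplus surrogate, and all function names and parameter values are illustrative assumptions. The sketch evaluates a one-dimensional nonlinear Hawkes intensity with a ReLU link, where the interaction function is a finite, representer-style kernel expansion in an RKHS.

    import numpy as np

    def relu(x):
        # Nonlinear link function: clips the underlying signal at zero so the
        # intensity stays nonnegative even with inhibiting interactions.
        return np.maximum(x, 0.0)

    def softplus(x, beta=20.0):
        # Smooth stand-in for ReLU (an illustrative choice, not necessarily
        # the paper's approximation): approaches ReLU as beta grows, which
        # makes likelihood-based criteria differentiable.
        return np.logaddexp(0.0, beta * x) / beta

    def gaussian_kernel(u, v, bandwidth=0.5):
        # Reproducing kernel (Gaussian here by assumption); the interaction
        # function h is modeled as an element of the associated RKHS.
        return np.exp(-((u - v) ** 2) / (2.0 * bandwidth ** 2))

    def intensity(t, events, mu, weights, anchors):
        # lambda(t) = relu( mu + sum_{t_j < t} h(t - t_j) ), with a finite
        # representer-style expansion h(u) = sum_k weights[k] * K(u, anchors[k]).
        past = events[events < t]
        h_sum = sum(
            w * gaussian_kernel(t - past, a).sum()
            for w, a in zip(weights, anchors)
        )
        return relu(mu + h_sum)

    # Toy usage: one process with three past events; the negative weight
    # creates inhibition, which the ReLU link keeps admissible.
    events = np.array([0.3, 0.9, 1.4])
    lam = intensity(2.0, events, mu=0.5,
                    weights=np.array([0.8, -0.4]),
                    anchors=np.array([0.2, 1.0]))
    print(lam)

    # The smooth surrogate stays close to ReLU for moderate beta:
    x = np.linspace(-1.0, 1.0, 5)
    print(np.max(np.abs(relu(x) - softplus(x))))

In the multivariate case, each neuron’s intensity would sum one such kernel expansion per interacting neuron; the softplus line merely illustrates how a smooth approximation of ReLU can make the estimation criteria tractable.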
Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper helps us better understand how neurons in our brain talk to each other. It proposes a new way to model that communication, which is important for understanding things like how memories are formed or how we learn. The method uses something called a “reproducing kernel Hilbert space” (don’t worry if that sounds weird!) and some clever math tricks to make it work. The authors tested their method on simulated data and showed that it works really well, better than other methods people have tried before.

Keywords

» Artificial intelligence  » Log likelihood  » ReLU  » Synthetic data