Summary of A Third-order Finite Difference Weighted Essentially Non-oscillatory Scheme with Shallow Neural Network, by Kwanghyuk Park et al.
A third-order finite difference weighted essentially non-oscillatory scheme with shallow neural network
by Kwanghyuk Park, Xinjuan Chen, Dongjin Lee, Jiaxi Gu, Jae-Hun Jung
First submitted to arXiv on: 8 Jul 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Neural and Evolutionary Computing (cs.NE); Numerical Analysis (math.NA)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper, each written at a different level of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract here |
Medium | GrooveSquid.com (original content) | This paper presents a neural network-based finite difference weighted essentially non-oscillatory (WENO) scheme for solving hyperbolic conservation laws. The authors design two loss functions, one based on the mean squared error and the other on the mean squared logarithmic error, using the WENO3-JS weights as training labels. Each loss is split into two components: one that pushes the network to satisfy WENO properties and another that matches the output weights to the linear weights, which improves behavior around discontinuities (see the sketch after this table). A shallow neural network (SNN) structure is chosen for computational efficiency. In numerical experiments, the resulting scheme outperforms WENO3-JS and WENO3-Z in one-dimensional examples and shows improved behavior in two-dimensional examples. |
Low | GrooveSquid.com (original content) | This paper introduces a new approach for solving certain math problems using special computer algorithms called finite difference weighted essentially non-oscillatory (WENO) schemes. The authors create two ways to measure how well the algorithm works: one checks whether it follows certain rules, and the other compares its results with simpler ones. This helps the algorithm do better around areas with big changes. The computer program they use is simple and efficient. When tested on math problems, the new approach does better than other methods. |
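For readers who want a concrete picture of the ingredients in the medium summary, the sketch below is a minimal, illustrative PyTorch rendering, not the authors' implementation: classical WENO3-JS nonlinear weights computed as labels, a shallow one-hidden-layer network that outputs two convex weights, and an MSE-style loss split into a WENO-matching term and a linear-weight term. The layer sizes, the weighting factor `lam`, the random training data, and the simplification of the "WENO properties" term to plain label matching are all assumptions, and the MSLE variant of the loss is omitted.

```python
# Minimal sketch (not the authors' code): WENO3-JS weights as labels and a
# shallow network trained with a two-part MSE-style loss, as described above.
# Layer sizes, names, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

EPS = 1e-6                              # regularization constant in WENO3-JS
D = torch.tensor([1.0 / 3.0, 2.0 / 3.0])  # linear (optimal) weights for WENO3

def weno3_js_weights(f):
    """Classical WENO3-JS nonlinear weights for a 3-point stencil
    f = (f_{i-1}, f_i, f_{i+1}); returns (omega_0, omega_1)."""
    beta0 = (f[..., 1] - f[..., 0]) ** 2      # smoothness of left substencil
    beta1 = (f[..., 2] - f[..., 1]) ** 2      # smoothness of right substencil
    alpha = D / (torch.stack([beta0, beta1], dim=-1) + EPS) ** 2
    return alpha / alpha.sum(dim=-1, keepdim=True)

class ShallowWeightNet(nn.Module):
    """One hidden layer mapping a 3-point stencil to two convex weights."""
    def __init__(self, hidden=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(3, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2))

    def forward(self, f):
        return torch.softmax(self.net(f), dim=-1)  # weights sum to one

def two_part_loss(pred, f, lam=0.1):
    """MSE against the WENO3-JS labels plus a term pulling the output
    toward the linear weights (a simplified version of the split loss)."""
    mse = nn.functional.mse_loss
    return mse(pred, weno3_js_weights(f)) + lam * mse(pred, D.expand_as(pred))

# Tiny training loop on random stencils (illustrative only).
model = ShallowWeightNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    f = torch.randn(256, 3)
    loss = two_part_loss(model(f), f)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

At inference time, the network's output would replace the WENO3-JS nonlinear weights in the flux reconstruction; the softmax output keeps the weights convex, while the linear-weight term in the loss encourages near-optimal weights in smooth regions, mirroring the trade-off described in the summary.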
Keywords
» Artificial intelligence » Loss function » Neural network