Summary of Accelerating Fractional PINNs Using Operational Matrices of Derivative, by Tayebeh Taheri et al.
Accelerating Fractional PINNs using Operational Matrices of Derivative
by Tayebeh Taheri, Alireza Afzal Aghaei, Kourosh Parand
First submitted to arXiv on: 25 Jan 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Numerical Analysis (math.NA)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here |
| Medium | GrooveSquid.com (original content) | This paper proposes a novel operational matrix method to accelerate the training of Fractional Physics-Informed Neural Networks (fPINNs). The approach uses a non-uniform discretization of the fractional Caputo operator, enabling efficient computation of fractional derivatives. The methodology replaces automatic differentiation with a matrix-vector product and is compatible with any neural network architecture. Specifically, it enhances the accuracy of PINNs when using the Legendre Neural Block (LNB) architecture, which incorporates Legendre polynomials into the PINN structure. The effectiveness of this method is demonstrated across various differential equations, including delay differential equations and systems of differential-algebraic equations. |
| Low | GrooveSquid.com (original content) | This paper helps us learn faster and more accurately about complex problems that involve time or space. It’s like using a shortcut to solve these problems! The scientists developed a new way to make computers learn by using something called “fractional neural networks.” This new method is better than the old one because it can handle problems with missing information or unknown patterns. They tested this method on many different types of problems and showed that it works really well. This is important because we can use these methods to solve many real-world problems, like predicting how a disease will spread or designing new materials. |
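To make the "operational matrix" idea concrete: the key point of the medium-difficulty summary is that the Caputo fractional derivative can be precomputed as a matrix, so evaluating the derivative during training becomes a single matrix-vector product instead of repeated automatic differentiation. The sketch below is *not* the paper's method (the authors use a non-uniform discretization and Legendre-based networks); it is a minimal illustration of the same principle using the classical L1 scheme on a uniform grid. The function name `caputo_l1_matrix` and the grid choices are assumptions for illustration only.

```python
import math
import numpy as np

def caputo_l1_matrix(alpha, N, T=1.0):
    """Operational matrix of the Caputo derivative of order alpha in (0, 1)
    on the uniform grid t_0, ..., t_N over [0, T], via the classical L1 scheme.
    Returns D of shape (N, N+1) such that D @ u approximates the Caputo
    derivative at t_1, ..., t_N from samples u = [u(t_0), ..., u(t_N)]."""
    h = T / N
    c = 1.0 / (math.gamma(2 - alpha) * h**alpha)
    # L1 weights: b_j = (j+1)^{1-a} - j^{1-a}
    b = np.array([(j + 1) ** (1 - alpha) - j ** (1 - alpha) for j in range(N)])
    D = np.zeros((N, N + 1))
    for n in range(1, N + 1):          # row for t_n
        for k in range(n):             # sum over increments u_{k+1} - u_k
            w = c * b[n - 1 - k]
            D[n - 1, k] -= w
            D[n - 1, k + 1] += w
    return D

alpha, N = 0.5, 64
t = np.linspace(0.0, 1.0, N + 1)
D = caputo_l1_matrix(alpha, N)         # built once, reused every iteration

u = t                                   # sample function u(t) = t
approx = D @ u                          # matrix-vector product, no autodiff
exact = t[1:] ** (1 - alpha) / math.gamma(2 - alpha)
print(np.max(np.abs(approx - exact)))   # the L1 scheme is exact for linear u
```

The payoff mirrors the paper's argument: `D` is assembled once before training, so each training step pays only for a dense matrix-vector product at the collocation points, rather than nested automatic differentiation through the network for a nonlocal fractional operator.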
Keywords
- Artificial intelligence
- Neural network