Abstract
The solution of time-dependent differential equations with neural networks has attracted a lot of attention recently. The central idea is to learn the laws that govern the evolution of the solution from data, which might be polluted with random noise. However, in contrast to other machine learning applications, usually a lot is known about the system at hand. For example, for many dynamical systems physical quantities such as energy or (angular) momentum are exactly conserved. Hence, the neural network has to learn these conservation laws from data, and they will only be satisfied approximately due to finite training time and random noise. In this paper we present an alternative approach which uses Noether's theorem to inherently incorporate conservation laws into the architecture of the neural network. We demonstrate that this leads to better predictions for three model systems: the motion of a non-relativistic particle in a three-dimensional Newtonian gravitational potential, the motion of a massive relativistic particle in the Schwarzschild metric, and a system of two interacting particles in four dimensions.
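To make the idea concrete, the sketch below shows one way a conservation law can be built into a network architecture rather than learned from data. This is not the authors' implementation; it is a minimal example assuming PyTorch, and the names `InvariantLagrangian` and `angular_momentum` are illustrative. Because the learned Lagrangian depends only on rotation-invariant scalars, Noether's theorem implies that the angular momentum of the induced dynamics is conserved exactly by construction, instead of only approximately after training.

```python
# Hedged sketch (not the paper's code): a learned Lagrangian restricted to
# SO(3)-invariant inputs, so angular momentum is conserved via Noether's theorem.
import torch
import torch.nn as nn

class InvariantLagrangian(nn.Module):
    """L(q, qdot) built only from the rotation-invariant scalars |q|^2, |qdot|^2, q.qdot."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, q, qdot):
        # Feeding only invariant features makes the Lagrangian rotationally symmetric.
        feats = torch.stack(
            [(q * q).sum(-1), (qdot * qdot).sum(-1), (q * qdot).sum(-1)], dim=-1
        )
        return self.net(feats).squeeze(-1)

def angular_momentum(q, qdot, lagrangian):
    # Canonical momentum p = dL/dqdot; the Noether charge for rotations is q x p.
    q = q.requires_grad_(True)
    qdot = qdot.requires_grad_(True)
    L = lagrangian(q, qdot).sum()
    p = torch.autograd.grad(L, qdot, create_graph=True)[0]
    return torch.cross(q, p, dim=-1)

# Usage: the Noether charge is well defined for any parameter values, so it is
# conserved along trajectories of the learned dynamics even before training.
model = InvariantLagrangian()
q, qdot = torch.randn(5, 3), torch.randn(5, 3)
print(angular_momentum(q, qdot, model))
```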
| Original language | English |
| --- | --- |
| Article number | 112234 |
| Number of pages | 24 |
| Journal | Journal of Computational Physics |
| Volume | 488 |
| Early online date | 19 May 2023 |
| DOIs | |
| Publication status | Published - 1 Sept 2023 |
Bibliographical note
Data availability: A link to the code that was used to generate the data is included in the paper.
Keywords
- Dynamical system
- Lagrangian mechanics
- Machine learning
- Noether's theorem
ASJC Scopus subject areas
- Computational Mathematics
- General Physics and Astronomy
- Applied Mathematics
- Numerical Analysis
- Computer Science Applications
- Modelling and Simulation
- Physics and Astronomy (miscellaneous)