Abstract
We investigate the use of models from the theory of regularity structures as features in machine learning tasks. A model is a polynomial function of a space–time signal designed to approximate solutions to partial differential equations (PDEs) well, even in low regularity regimes. Models can be seen as natural multi-dimensional generalisations of signatures of paths; our work therefore aims to extend the recent use of signatures in data science beyond the context of time-ordered data. We provide a flexible definition of a model feature vector associated to a space–time signal, along with two algorithms illustrating how these features can be combined with linear regression. We apply these algorithms in several numerical experiments designed to learn solutions to PDEs with given forcing and boundary data. Our experiments include semi-linear parabolic and wave equations with forcing, and Burgers’ equation with no forcing. We find that our algorithms outperform several alternative methods. Additionally, in the experiment with Burgers’ equation, we find non-trivial predictive power even when noise is added to the observations.
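The abstract's pipeline (build model-like features from a forcing signal, then fit a linear regression to predict the PDE solution) can be sketched in a few lines. This is a simplified illustration under stated assumptions, not the paper's exact construction: we work on a 1-D periodic grid, use the heat semigroup as the smoothing kernel `K`, take a small hypothetical feature set `{1, K*ξ, (K*ξ)², K*((K*ξ)²)}`, and fit against a purely synthetic target built from those same features.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128                       # spatial grid points
xi = rng.standard_normal(n)   # white-noise forcing sample

def heat_smooth(v, t=0.01):
    """Apply the periodic heat semigroup exp(t * Laplacian) via FFT."""
    k = np.fft.fftfreq(len(v)) * len(v)  # integer frequencies
    return np.fft.ifft(np.fft.fft(v) * np.exp(-t * (2 * np.pi * k) ** 2)).real

# Model-like features: iterated smoothings and pointwise products of the noise.
Kxi = heat_smooth(xi)        # K * xi
Kxi2 = Kxi ** 2              # (K * xi)^2, pointwise
KKxi2 = heat_smooth(Kxi2)    # K * ((K * xi)^2)

# Feature matrix: one row per grid point, one column per model feature.
Phi = np.column_stack([np.ones(n), Kxi, Kxi2, KKxi2])

# Synthetic "solution" lying in the span of the features (demonstration only).
u = 0.5 * Kxi + 0.2 * KKxi2

# Linear regression of the solution on the model features.
coef, *_ = np.linalg.lstsq(Phi, u, rcond=None)
pred = Phi @ coef
```

In the paper's setting the target would instead be a numerically computed PDE solution, and the feature vector would contain many more iterated integrals of the forcing; the regression step, however, has the same shape as above.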
Original language | English
---|---
Article number | 13
Journal | Journal of Scientific Computing
Volume | 98
Issue number | 1
Early online date | 23 Nov 2023
DOIs |
Publication status | Published - 31 Jan 2024
Bibliographical note
Funding Information: AG and HW were supported by the Leverhulme Trust through a Philip Leverhulme Prize during the writing of this article. HW was also supported by the Royal Society through the University Research Fellowship UF140187.
Keywords
- Partial differential equations
- Path signatures
- Regression
- Regularity structures
- Supervised learning
ASJC Scopus subject areas
- Software
- Theoretical Computer Science
- Numerical Analysis
- General Engineering
- Computational Mathematics
- Computational Theory and Mathematics
- Applied Mathematics