Abstract
We present a novel offline-online method to mitigate the computational burden of characterizing posterior random variables in statistical learning. In the offline phase, the proposed method learns the joint law of the parameter random variables and the observable random variables in the tensor-train (TT) format. In the online phase, the resulting order-preserving conditional transport can characterize the posterior random variables given newly observed data in real time. Compared with state-of-the-art normalizing flow techniques, the proposed method relies on function approximation and is equipped with a thorough performance analysis. The function approximation perspective also allows us to further extend the capability of transport maps in challenging problems with high-dimensional observations and high-dimensional parameters. On the one hand, we present novel heuristics to reorder and/or reparametrize the variables to enhance the approximation power of TT. On the other hand, we integrate the TT-based transport maps and the parameter reordering/reparametrization into layered compositions to further improve the performance of the resulting transport maps. We demonstrate the efficiency of the proposed method on various statistical learning tasks in ordinary differential equations (ODEs) and partial differential equations (PDEs).
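For intuition, the sketch below illustrates the offline-online idea on a toy one-parameter, one-observation problem. It is not the authors' implementation: a plain grid-based density surrogate stands in for the tensor-train approximation, and the online step is the one-dimensional order-preserving (inverse-Rosenblatt) conditional transport. All names and numbers (`forward_model`, `sigma`, the grids) are illustrative assumptions.

```python
# Minimal sketch of the offline-online workflow, assuming a 1D parameter theta,
# a 1D observable y, and a grid-based surrogate in place of the TT format.
import numpy as np

# --- Offline phase: tabulate an approximation of the joint density p(theta, y) ---
theta_grid = np.linspace(-3.0, 3.0, 400)   # parameter grid (assumed)
y_grid = np.linspace(-4.0, 4.0, 400)       # observable grid (assumed)

def forward_model(theta):
    """Hypothetical forward map G(theta); a real ODE/PDE solver would go here."""
    return np.tanh(theta)

prior = np.exp(-0.5 * theta_grid**2)       # standard normal prior (unnormalized)
sigma = 0.3                                # assumed observation-noise std
# joint[i, j] approximates p(theta_i, y_j) = prior(theta_i) * N(y_j | G(theta_i), sigma^2)
joint = prior[:, None] * np.exp(
    -0.5 * ((y_grid[None, :] - forward_model(theta_grid)[:, None]) / sigma) ** 2
)

# --- Online phase: condition on newly observed data and transport samples ---
def posterior_samples(y_obs, n_samples=1000, rng=np.random.default_rng(0)):
    # Slice the joint density at the observed y (nearest grid point).
    j = np.argmin(np.abs(y_grid - y_obs))
    post = joint[:, j]
    post /= np.trapz(post, theta_grid)     # normalize the conditional density
    # Conditional CDF in theta; inverting it is the 1D order-preserving transport.
    cdf = np.cumsum(post)
    cdf /= cdf[-1]
    u = rng.uniform(size=n_samples)
    return np.interp(u, cdf, theta_grid)   # push uniform samples to the posterior

samples = posterior_samples(y_obs=0.5)
print(samples.mean(), samples.std())
```

In higher dimensions the paper replaces the grid with a TT decomposition and composes such one-dimensional inverse-CDF maps coordinate by coordinate, which is what makes the online conditioning step cheap once the offline approximation is built.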
| Original language | English |
| --- | --- |
| Article number | 112103 |
| Journal | Journal of Computational Physics |
| Volume | 485 |
| Early online date | 30 Mar 2023 |
| DOIs | |
| Publication status | E-pub ahead of print - 30 Mar 2023 |
Keywords
- Approximate Bayesian computation
- Dimension reduction
- Generative models
- Inverse problems
- Markov chain Monte Carlo
- Tensor train
- Transport maps
ASJC Scopus subject areas
- Numerical Analysis
- Modelling and Simulation
- Physics and Astronomy (miscellaneous)
- Physics and Astronomy (all)
- Computer Science Applications
- Computational Mathematics
- Applied Mathematics