Abstract
Low-rank tensor approximations have shown great potential for uncertainty quantification in high dimensions, for example, to build surrogate models that can be used to speed up large-scale inference problems [M. Eigel, M. Marschall, and R. Schneider, Inverse Problems, 34 (2018), 035010; S. Dolgov et al., Stat. Comput., 30 (2020), pp. 603–625]. The feasibility and efficiency of such approaches depend critically on the rank that is necessary to represent or approximate the underlying distribution. In this paper, a priori rank bounds are developed for approximations in the functional Tensor-Train representation in the case of Gaussian models. It is shown that under suitable conditions on the precision matrix, the Gaussian density can be approximated to high accuracy without suffering from an exponential growth of complexity as the dimension increases. These results provide a rigorous justification of both the suitability and the limitations of low-rank tensor methods in a simple but important model case. Numerical experiments confirm that the rank bounds capture the qualitative behavior of the rank structure when the parameters of the precision matrix and the accuracy of the approximation are varied. Finally, the practical relevance of the theoretical results is demonstrated in the context of a Bayesian filtering problem.
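As a rough, self-contained illustration of the phenomenon described in the abstract (not the paper's construction), the following NumPy sketch discretizes a low-dimensional Gaussian density with a tridiagonal precision matrix on a tensor grid and compresses it with a plain TT-SVD. The reported TT ranks stay small for a diagonal precision matrix and grow as the off-diagonal coupling increases or the tolerance tightens. All names and parameter values (`gaussian_density_tensor`, `tt_svd_ranks`, `d`, `n`, `tol`, `alpha`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np


def gaussian_density_tensor(A, grid):
    """Evaluate exp(-0.5 x^T A x) at every point of the full tensor-product grid."""
    d = A.shape[0]
    mesh = np.meshgrid(*([grid] * d), indexing="ij")   # d arrays of shape (n, ..., n)
    X = np.stack(mesh, axis=-1)                        # shape (n, ..., n, d)
    quad = np.einsum("...i,ij,...j->...", X, A, X)     # quadratic form at each grid point
    return np.exp(-0.5 * quad)


def tt_svd_ranks(tensor, tol):
    """Compress a full tensor with the standard TT-SVD and return its TT ranks."""
    shape = tensor.shape
    d = len(shape)
    delta = tol * np.linalg.norm(tensor) / np.sqrt(d - 1)  # per-unfolding threshold
    ranks = [1]
    C = tensor.reshape(shape[0], -1)
    for k in range(d - 1):
        C = C.reshape(ranks[-1] * shape[k], -1)
        U, S, Vt = np.linalg.svd(C, full_matrices=False)
        tail = np.sqrt(np.cumsum(S[::-1] ** 2))[::-1]      # tail[r] = error if truncated to rank r
        r = max(1, int(np.sum(tail > delta)))              # smallest rank meeting the threshold
        ranks.append(r)
        C = S[:r, None] * Vt[:r, :]                        # pass the remainder to the next unfolding
    ranks.append(1)
    return ranks


if __name__ == "__main__":
    d, n, tol = 4, 16, 1e-6                 # illustrative choices, not from the paper
    grid = np.linspace(-4.0, 4.0, n)
    for alpha in (0.0, 0.1, 0.3):           # off-diagonal coupling of the precision matrix
        A = np.eye(d) + alpha * (np.eye(d, k=1) + np.eye(d, k=-1))
        density = gaussian_density_tensor(A, grid)
        print(f"alpha = {alpha:.1f}, TT ranks: {tt_svd_ranks(density, tol)}")
```

For `alpha = 0` the density is a product of univariate Gaussians and the TT ranks collapse to 1; stronger coupling yields larger ranks, which is the qualitative dependence on the precision matrix that the paper quantifies rigorously.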
| Original language | English |
| --- | --- |
| Pages (from-to) | 1191-1224 |
| Journal | SIAM/ASA Journal on Uncertainty Quantification |
| Volume | 10 |
| Issue number | 3 |
| Early online date | 28 Sept 2022 |
| DOIs | https://doi.org/10.1137/21M1450604 |
| Publication status | Published - 30 Sept 2022 |
Bibliographical note
Funding Information: Received by the editors October 4, 2021; accepted for publication (in revised form) March 30, 2022; published electronically September 28, 2022. https://doi.org/10.1137/21M1450604. Funding: This research was supported by the German Research Foundation (DFG) within the projects STE 571/16-1 and DFG-SPP 2298 “Theoretical Foundations of Deep Learning.” TU Berlin, D-10587 Berlin, Germany ([email protected], [email protected], steidl@math.tu-berlin.de).
Publisher Copyright:
© 2022 Society for Industrial and Applied Mathematics.
Keywords
- math.NA
- cs.NA
- math.ST
- stat.TH
- 15A23, 15A69, 65C60, 65D32, 65D15, 41A10
- Markov chain Monte Carlo
- inverse problems
ASJC Scopus subject areas
- Applied Mathematics
- Discrete Mathematics and Combinatorics
- Statistics and Probability
- Statistics, Probability and Uncertainty
- Modelling and Simulation
Projects
- 1 Finished
- Tensor decomposition sampling algorithms for Bayesian inverse problems
  Dolgov, S. (PI)
  Engineering and Physical Sciences Research Council
  1/03/21 → 28/02/25
  Project: Research council