Abstract
In this paper we study algorithms to find a Gaussian approximation to a target measure defined on a Hilbert space of functions; the target measure itself is defined via its density with respect to a reference Gaussian measure. We employ the Kullback-Leibler divergence as a measure of discrepancy and find the best Gaussian approximation by minimizing this divergence. It then follows that the approximating Gaussian must be equivalent to the Gaussian reference measure, defining a natural function space setting for the underlying calculus of variations problem. We introduce a computational algorithm well-adapted to the required minimization, seeking the mean as a function and parameterizing the covariance in two different ways: through low-rank perturbations of the reference covariance and through Schrödinger potential perturbations of the inverse reference covariance. Two applications are shown: to a nonlinear inverse problem in elliptic PDEs and to a conditioned diffusion process. These Gaussian approximations also provide a preconditioned proposal distribution for improved preconditioned Crank-Nicolson Markov chain Monte Carlo (MCMC) sampling of the target distribution. This approach is not only well-adapted to the high-dimensional setting but also behaves well with respect to small observational noise (resp., small temperatures) in the inverse problem (resp., conditioned diffusion).
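The abstract's closing claim, that the Gaussian approximation supplies a proposal for preconditioned Crank-Nicolson (pCN) MCMC, can be illustrated concretely. Below is a minimal finite-dimensional sketch, not the paper's implementation: the names `Psi`, `m`, `C_chol`, and `beta` are illustrative assumptions, and `Psi` is taken to be the negative log-density of the target relative to the approximating Gaussian N(m, C).

```python
# A minimal sketch (assumed discretization in R^d) of pCN sampling with the
# proposal centered on a Gaussian approximation N(m, C) of the target.
import numpy as np

def pcn_sample(Psi, m, C_chol, beta=0.2, n_steps=10_000, rng=None):
    """Sample the measure with density exp(-Psi(u)) relative to N(m, C).

    Psi    : callable, negative log-density of the target w.r.t. N(m, C)
    m      : (d,) mean of the Gaussian approximation
    C_chol : (d, d) Cholesky factor of its covariance, C = C_chol @ C_chol.T
    beta   : step-size parameter in (0, 1]
    """
    rng = np.random.default_rng(rng)
    u = m.copy()                                   # start the chain at the mean
    psi_u = Psi(u)
    samples = np.empty((n_steps, m.size))
    for k in range(n_steps):
        xi = C_chol @ rng.standard_normal(m.size)  # xi ~ N(0, C)
        # pCN proposal: leaves N(m, C) invariant, so the acceptance
        # probability involves only Psi and is robust to the dimension d.
        v = m + np.sqrt(1.0 - beta**2) * (u - m) + beta * xi
        psi_v = Psi(v)
        # Accept with probability min(1, exp(Psi(u) - Psi(v)))
        if np.log(rng.uniform()) < psi_u - psi_v:
            u, psi_u = v, psi_v
        samples[k] = u
    return samples
```

For a target given by density exp(-Φ(u)) with respect to a prior N(m0, C0), the function `Psi` would collect Φ together with the quadratic change-of-measure terms between N(m0, C0) and N(m, C); the equivalence of the two Gaussians noted in the abstract is what makes this change of measure well defined.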
| Field | Value |
| --- | --- |
| Original language | English |
| Pages (from-to) | A2733-A2757 |
| Journal | SIAM Journal on Scientific Computing |
| Volume | 37 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 1 Jan 2015 |
Keywords
- Gaussian distributions
- Inverse problems
- Kullback-Leibler divergence
- MCMC
- Relative entropy
ASJC Scopus subject areas
- Computational Mathematics
- Applied Mathematics