Kullback-Leibler approximation for probability measures on infinite dimensional spaces

F. J. Pinski, G. Simpson, A. M. Stuart, H. Weber

Research output: Contribution to journal › Article › peer-review

30 Citations (SciVal)


In a variety of applications it is important to extract information from a probability measure μ on an infinite dimensional space. Examples include the Bayesian approach to inverse problems and (possibly conditioned) continuous time Markov processes. It may then be of interest to find a measure ν, from within a simple class of measures, which approximates μ. This problem is studied in the case where the Kullback-Leibler divergence is employed to measure the quality of the approximation. A calculus of variations viewpoint is adopted, and the particular case where ν is chosen from the set of Gaussian measures is studied in detail. Basic existence and uniqueness theorems are established, together with properties of minimizing sequences. Furthermore, parameterization of the class of Gaussians through the mean and inverse covariance is introduced, the need for regularization is explained, and a regularized minimization is studied in detail. The calculus of variations framework resulting from this work provides the appropriate underpinning for computational algorithms.
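To make the setting concrete, the following is a minimal finite-dimensional sketch (not the paper's method) of Kullback-Leibler approximation by a Gaussian: in one dimension the divergence D_KL(ν‖μ) between two Gaussians has a closed form, and when the target μ is itself Gaussian, minimizing over the mean and standard deviation of ν recovers μ with divergence zero. The grid search and the particular parameter values are illustrative assumptions, not from the paper.

```python
import math

def kl_gauss(m1, s1, m2, s2):
    """D_KL( N(m1, s1^2) || N(m2, s2^2) ) in one dimension, in nats."""
    return math.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

# Target measure mu = N(1.0, 2.0^2); approximate it by nu = N(m, s^2),
# minimizing D_KL(nu || mu) over a grid of (m, s).  Since mu is itself
# Gaussian here, the minimizer is nu = mu and the minimum value is 0.
best = min(
    ((m / 10, s / 10) for m in range(-30, 31) for s in range(1, 41)),
    key=lambda p: kl_gauss(p[0], p[1], 1.0, 2.0),
)
print(best)  # -> (1.0, 2.0)
```

The infinite-dimensional problem studied in the paper replaces the pair (mean, variance) with a mean element and an (inverse) covariance operator, which is why existence, uniqueness, and regularization become genuine analytical questions rather than a closed-form calculation.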

Original language: English
Pages (from-to): 4091-4122
Number of pages: 32
Journal: SIAM Journal on Mathematical Analysis
Issue number: 6
Publication status: Published - 1 Jan 2015

Keywords
  • Gaussian measures
  • Kullback-Leibler divergence
  • Relative entropy

ASJC Scopus subject areas

  • Analysis
  • Computational Mathematics
  • Applied Mathematics


