Deep Composition of Tensor Trains using Squared Inverse Rosenblatt Transports

Tiangang Cui, Sergey Dolgov

Research output: Contribution to journal › Article

Abstract

Characterising intractable high-dimensional random variables is one of the fundamental challenges in stochastic computation. The recent surge of transport maps offers a mathematical foundation and new insights for tackling this challenge by coupling intractable random variables with tractable reference random variables. This paper generalises a recently developed functional tensor-train (FTT) approximation of the inverse Rosenblatt transport [14] to a wide class of high-dimensional nonnegative functions, such as unnormalised probability density functions. First, we extend the inverse Rosenblatt transform to enable the transport to general reference measures other than the uniform measure. We develop an efficient procedure to compute this transport from a squared FTT decomposition which preserves the monotonicity. More crucially, we integrate the proposed monotonicity-preserving FTT transport into a nested variable transformation framework inspired by deep neural networks. The resulting deep inverse Rosenblatt transport significantly expands the capability of tensor approximations and transport maps to random variables with complicated nonlinear interactions and concentrated density functions. We demonstrate the efficacy of the proposed approach on a range of applications in statistical learning and uncertainty quantification, including parameter estimation for dynamical systems and inverse problems constrained by partial differential equations.
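The core idea described in the abstract can be illustrated in one dimension: approximate the square root of a nonnegative target density, square the approximation so the surrogate density is nonnegative by construction, and transport uniform reference samples through the inverse CDF of the normalised surrogate. The sketch below is a minimal, hedged illustration of this squared-approximation principle; the function names, grid discretisation, and interpolation choices are illustrative assumptions, not the authors' actual FTT-based implementation.

```python
import numpy as np

def inverse_rosenblatt_1d(sqrt_density, lo, hi, n_grid=2001):
    """Map u in (0,1) to x distributed ~ sqrt_density(x)**2 (normalised).

    Illustrative 1-D analogue of a squared-surrogate inverse Rosenblatt
    transport; names and discretisation are assumptions for this sketch.
    """
    x = np.linspace(lo, hi, n_grid)
    g2 = sqrt_density(x) ** 2        # squared surrogate: nonnegative by construction
    # Cumulative trapezoid rule gives a monotone (piecewise-linear) CDF,
    # since the integrand g2 is nonnegative everywhere.
    cdf = np.concatenate(([0.0], np.cumsum((g2[1:] + g2[:-1]) * np.diff(x) / 2)))
    cdf /= cdf[-1]                   # normalise the unnormalised surrogate
    # Monotone inverse CDF via interpolation: the transport map itself.
    return lambda u: np.interp(u, cdf, x)

# Example: sqrt of an (unnormalised) standard normal density, exp(-x^2/4),
# so the squared surrogate is proportional to exp(-x^2/2).
T = inverse_rosenblatt_1d(lambda x: np.exp(-x**2 / 4), -8.0, 8.0)
u = np.random.default_rng(0).uniform(size=100_000)
samples = T(u)
print(samples.mean(), samples.std())  # sample moments should be close to 0 and 1
```

Squaring the approximation is what guarantees nonnegativity of the surrogate, and hence monotonicity of the resulting CDF, without imposing constraints during the approximation step; the paper carries this construction to high dimensions via tensor trains and deep composition.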
Original language: English
Number of pages: 47
Journal: Foundations of Computational Mathematics
DOIs
Publication status: Published - 21 Sep 2021
