In general, the naive importance sampling (IS) estimator does not work well in examples involving simultaneous inference on several targets, because the importance weights can take arbitrarily large values, making the estimator highly unstable. In such situations, researchers prefer alternative multiple IS estimators involving samples from multiple proposal distributions. Just like the naive IS, the success of these multiple IS estimators depends crucially on the choice of the proposal distributions, which is the focus of this study. We propose three methods: (i) a geometric space-filling approach, (ii) a minimax variance approach, and (iii) a maximum entropy approach. The first two methods apply to any IS estimator, whereas the third approach is described in the context of a two-stage IS estimator. For the first method, we propose a suitable measure of "closeness" based on the symmetric Kullback-Leibler divergence and the second and third approaches use estimates of asymptotic variances of an IS estimator and the reverse logistic regression estimator, respectively. Thus, when samples from the proposal distributions are obtained by running Markov chains, we provide consistent spectral variance estimators for these asymptotic variances. Lastly, we demonstrate the proposed methods for selecting proposal densities using various detailed examples.
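The paper's three proposal-selection methods are beyond the scope of a short example, but the multiple IS estimator they serve can be illustrated concretely. The sketch below is a generic multiple importance sampling estimator using balance-heuristic (deterministic-mixture) weights, with a standard normal target and three normal proposals chosen purely for illustration; it is not the paper's own construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def target_pdf(x):
    # Target density p: standard normal (illustrative choice).
    return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Several proposal distributions: normals with different means (mean, sd).
proposals = [(-1.0, 1.5), (0.0, 1.5), (1.0, 1.5)]
n_per = 5000
N = n_per * len(proposals)

# Balance heuristic: each sample is weighted by
#   w(x) = p(x) / sum_j (n_j / N) q_j(x),
# i.e. the target over the effective mixture of all proposals, which keeps
# weights bounded wherever at least one proposal covers the target well.
samples, weights = [], []
for mu, sd in proposals:
    x = rng.normal(mu, sd, n_per)
    mix = sum((n_per / N) * normal_pdf(x, m, s) for m, s in proposals)
    samples.append(x)
    weights.append(target_pdf(x) / mix)

x = np.concatenate(samples)
w = np.concatenate(weights)

# Estimate E_p[X^2] (true value 1 for a standard normal target).
estimate = np.sum(w * x**2) / N
```

Because the denominator is a mixture over all proposals, no single poorly matched proposal can blow up the weights, which is the instability the abstract attributes to naive IS.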

Original language: English
Pages (from-to): 27-46
Number of pages: 20
Journal: Statistica Sinica
Issue number: 1
Publication status: Published - 31 Jan 2024


Keywords

  • Bayes factor
  • Markov chain
  • central limit theorem
  • marginal likelihood
  • polynomial ergodicity
  • reverse logistic regression

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

