A Comparative Review of Dimension Reduction Methods in Approximate Bayesian Computation

M. G. B. Blum, M. A. Nunes, D. Prangle, S. A. Sisson

Research output: Contribution to journal › Article

140 Citations (Scopus)

Abstract

Approximate Bayesian computation (ABC) methods make use of comparisons between simulated and observed summary statistics to overcome the problem of computationally intractable likelihood functions. As the practical implementation of ABC requires computations based on vectors of summary statistics, rather than full data sets, a central question is how to derive low-dimensional summary statistics from the observed data with minimal loss of information. In this article we provide a comprehensive review and comparison of the performance of the principal methods of dimension reduction proposed in the ABC literature. The methods are split into three nonmutually exclusive classes consisting of best subset selection methods, projection techniques and regularization. In addition, we introduce two new methods of dimension reduction. The first is a best subset selection method based on Akaike and Bayesian information criteria, and the second uses ridge regression as a regularization procedure. We illustrate the performance of these dimension reduction techniques through the analysis of three challenging models and data sets.
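As a rough illustration of the regularization-based projection idea described in the abstract, the minimal Python sketch below learns a one-dimensional linear combination of candidate summary statistics by closed-form ridge regression on pilot simulations, and then uses that projection inside plain rejection ABC. The toy Gaussian model, the candidate statistics, the penalty lam and the 1% acceptance quantile are illustrative assumptions for this sketch, not choices made in the paper.

import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    """Toy model: n i.i.d. Normal(theta, 1) observations."""
    return rng.normal(theta, 1.0, size=n)

def summaries(x):
    """Candidate summary statistics: a mix of informative and weak ones."""
    return np.array([x.mean(), np.median(x), x.std(),
                     x.max() - x.min(), np.mean(x ** 3)])

def ridge_fit(S, theta, lam=1.0):
    """Closed-form ridge regression of theta on centred summaries."""
    S_c = S - S.mean(axis=0)
    t_c = theta - theta.mean()
    p = S.shape[1]
    return np.linalg.solve(S_c.T @ S_c + lam * np.eye(p), S_c.T @ t_c)

# Pilot simulations used to learn a one-dimensional projection of the summaries.
theta_pilot = rng.uniform(-5, 5, size=2000)
S_pilot = np.array([summaries(simulate(t)) for t in theta_pilot])
beta = ridge_fit(S_pilot, theta_pilot, lam=10.0)

# Observed data (true theta = 2) and its projected summary.
x_obs = rng.normal(2.0, 1.0, size=50)
s_obs = summaries(x_obs) @ beta

# Rejection ABC on the projected (one-dimensional) summary statistic.
theta_prop = rng.uniform(-5, 5, size=20000)
s_prop = np.array([summaries(simulate(t)) @ beta for t in theta_prop])
eps = np.quantile(np.abs(s_prop - s_obs), 0.01)   # keep the closest 1%
posterior = theta_prop[np.abs(s_prop - s_obs) <= eps]
print(posterior.mean(), posterior.std())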
Original language: English
Pages (from-to): 189-208
Number of pages: 20
Journal: Statistical Science
Volume: 28
Issue number: 2
DOI: 10.1214/12-STS406
Publication status: Published - 1 May 2013

Fingerprint

Bayesian Computation
Dimension Reduction
Reduction Method
Subset Selection
Statistics
Regularization
Bayesian Information Criterion
Ridge Regression
Akaike Information Criterion
Likelihood Function
Projection Method
Review

Cite this

A Comparative Review of Dimension Reduction Methods in Approximate Bayesian Computation. / Blum, M. G. B.; Nunes, M. A.; Prangle, D.; Sisson, S. A.

In: Statistical Science, Vol. 28, No. 2, 01.05.2013, p. 189-208.

@article{5292d428a97748a798619f76d2dd278d,
title = "A Comparative Review of Dimension Reduction Methods in Approximate Bayesian Computation",
abstract = "Approximate Bayesian computation (ABC) methods make use of comparisons between simulated and observed summary statistics to overcome the problem of computationally intractable likelihood functions. As the practical implementation of ABC requires computations based on vectors of summary statistics, rather than full data sets, a central question is how to derive low-dimensional summary statistics from the observed data with minimal loss of information. In this article we provide a comprehensive review and comparison of the performance of the principal methods of dimension reduction proposed in the ABC literature. The methods are split into three nonmutually exclusive classes consisting of best subset selection methods, projection techniques and regularization. In addition, we introduce two new methods of dimension reduction. The first is a best subset selection method based on Akaike and Bayesian information criteria, and the second uses ridge regression as a regularization procedure. We illustrate the performance of these dimension reduction techniques through the analysis of three challenging models and data sets.",
author = "Blum, {M. G. B.} and Nunes, {M. A.} and D. Prangle and Sisson, {S. A.}",
year = "2013",
month = "5",
day = "1",
doi = "10.1214/12-STS406",
language = "English",
volume = "28",
pages = "189--208",
journal = "Statistical Science",
issn = "0883-4237",
publisher = "Institute of Mathematical Statistics",
number = "2",

}
