TY - JOUR

T1 - Bayesian computing with INLA

T2 - a review

AU - Rue, Håvard

AU - Riebler, Andrea

AU - Sørbye, Sigrunn H.

AU - Illian, Janine B.

AU - Simpson, Daniel P.

AU - Lindgren, Finn K.

PY - 2017/3/7

Y1 - 2017/3/7

N2 - The key operation in Bayesian inference is to compute high-dimensional integrals. An old approximate technique is the Laplace method or approximation, which dates back to Pierre-Simon Laplace (1774). This simple idea approximates the integrand with a second-order Taylor expansion around the mode and computes the integral analytically. By developing a nested version of this classical idea, combined with modern numerical techniques for sparse matrices, we obtain the approach of integrated nested Laplace approximations (INLA) to do approximate Bayesian inference for latent Gaussian models (LGMs). LGMs represent an important model abstraction for Bayesian inference and include a large proportion of the statistical models used today. In this review, we discuss the reasons for the success of the INLA approach, the R-INLA package, why it is so accurate, why the approximations are very quick to compute, and why LGMs make such a useful concept for Bayesian computing.

KW - Approximate Bayesian inference

KW - Gaussian Markov random fields

KW - Laplace approximations

KW - Latent Gaussian models

KW - Numerical integration

KW - Sparse matrices

UR - http://www.scopus.com/inward/record.url?scp=85015210803&partnerID=8YFLogxK

UR - http://dx.doi.org/10.1146/annurev-statistics-060116-054045

DO - 10.1146/annurev-statistics-060116-054045

M3 - Review article

AN - SCOPUS:85015210803

SN - 2326-8298

VL - 4

SP - 395

EP - 421

JO - Annual Review of Statistics and Its Application

JF - Annual Review of Statistics and Its Application

ER -