Parallelizing particle filters with butterfly interactions

Kari Heine, Nick Whiteley, Ali Taylan Cemgil

Research output: Contribution to journal › Article › peer-review

6 Citations (SciVal)
83 Downloads (Pure)

Abstract

The bootstrap particle filter (BPF) is the cornerstone of many algorithms used for solving generally intractable inference problems with hidden Markov models. The long-term stability of the BPF arises from particle interactions that typically make parallel implementations of the BPF nontrivial. We propose a method whereby particle interaction is done in several stages. With the proposed method, full interaction can be accomplished even if we allow only pairwise communications between processing elements at each stage. We show that our method preserves the consistency and the long-term stability of the BPF, although our analysis suggests that the constraints on the stagewise interactions introduce errors leading to a lower convergence rate than standard Monte Carlo. The proposed method also suggests a new, more flexible, adaptive resampling scheme, which, according to our numerical experiments, is the method of choice, displaying a notable gain in efficiency in certain parallel computing scenarios.
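The abstract's central idea, that full interaction among processing elements can be achieved even when each stage allows only pairwise communication, follows a butterfly pattern familiar from FFTs. The following is a minimal illustrative sketch of such a pattern (not the paper's algorithm; the function name and the use of simple averaging are assumptions for illustration): with 2^d elements, each stage pairs element i with the element whose index differs in one bit, and after d stages every element reflects the values of all others.

```python
# Illustrative sketch (not the paper's method): butterfly-style
# pairwise interaction among 2^d "processing elements", each
# holding a scalar (e.g., a local weight sum).

def butterfly_average(values):
    """Spread information across all elements via pairwise stages.

    At stage s, element i exchanges with its partner i XOR 2^s
    (one bit flipped, as in an FFT butterfly) and both keep the
    pair's average. After log2(n) stages each element holds the
    global mean, yet only one pairwise exchange occurred per stage.
    """
    n = len(values)
    assert n & (n - 1) == 0, "number of elements must be a power of two"
    vals = list(values)
    stage = 1
    while stage < n:
        nxt = vals[:]
        for i in range(n):
            partner = i ^ stage  # partner index differs in exactly one bit
            nxt[i] = 0.5 * (vals[i] + vals[partner])
        vals = nxt
        stage <<= 1  # next stage flips the next bit
    return vals

# With 8 elements, 3 stages of pairwise exchange suffice for full
# interaction: every element ends up holding the global mean.
print(butterfly_average([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]))
```

In the paper's setting the stagewise operations are resampling steps rather than averages, and the analysis quantifies the extra Monte Carlo error these communication constraints introduce; the sketch only shows why log2(n) pairwise stages can substitute for one all-to-all interaction.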

Original language: English
Pages (from-to): 361-396
Number of pages: 36
Journal: Scandinavian Journal of Statistics
Volume: 47
Issue number: 2
Early online date: 16 Aug 2019
DOIs
Publication status: Published - 1 Jun 2020

Bibliographical note

Funding Information:
This research made use of the Balena High Performance Computing Service at the University of Bath.

Publisher Copyright:
© 2019 Board of the Foundation of the Scandinavian Journal of Statistics

Keywords

  • hidden Markov model
  • parallelism
  • particle filter
  • particle interaction
  • sequential Monte Carlo

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
