

Inference in continuous label Markov random fields is a challenging task. We use particle belief propagation (PBP) to solve the inference problem in continuous label space. Sampling particles from the belief distribution is typically done with Metropolis-Hastings (MH) Markov chain Monte Carlo (MCMC) methods, which involve sampling from a proposal distribution. This proposal distribution has to be carefully designed for the particular model and input data to achieve fast convergence. We propose to avoid the dependence on a proposal distribution by introducing a slice sampling based PBP algorithm. The proposed approach shows superior convergence performance on an image denoising toy example. Our findings are validated on a challenging relational 2D feature tracking application.
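The key property of slice sampling exploited in the abstract is that, unlike Metropolis-Hastings, it needs no hand-tuned proposal distribution: it only requires pointwise evaluation of an unnormalized density. A minimal 1D sketch of this idea (the stepping-out and shrinkage procedure of Neal's slice sampler; the function names and parameters below are our own illustration, not the paper's implementation) looks like this:

```python
# Illustrative 1D slice sampler with stepping-out and shrinkage.
# This is a generic sketch of the sampling primitive, not the
# paper's PBP algorithm; `w` (initial interval width) and
# `max_steps` are hypothetical tuning parameters.
import math
import random

def slice_sample(log_density, x0, w=1.0, n_samples=1000, max_steps=50):
    """Sample from an unnormalized log-density without a proposal."""
    samples = []
    x = x0
    for _ in range(n_samples):
        # Draw an auxiliary "height" uniformly under the density at x.
        log_y = log_density(x) + math.log(random.random())
        # Step out: grow the interval [lo, hi] until it brackets the slice.
        lo = x - w * random.random()
        hi = lo + w
        steps = max_steps
        while steps > 0 and log_density(lo) > log_y:
            lo -= w
            steps -= 1
        steps = max_steps
        while steps > 0 and log_density(hi) > log_y:
            hi += w
            steps -= 1
        # Shrink: sample uniformly from the interval, shrinking it
        # toward x whenever the candidate falls outside the slice.
        while True:
            x_new = lo + (hi - lo) * random.random()
            if log_density(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                lo = x_new
            else:
                hi = x_new
        samples.append(x)
    return samples

# Example: sample a standard normal (log-density up to a constant).
random.seed(0)
xs = slice_sample(lambda x: -0.5 * x * x, x0=0.0, n_samples=5000)
mean = sum(xs) / len(xs)
var = sum((v - mean) ** 2 for v in xs) / len(xs)
```

In a PBP setting, `log_density` would be the (unnormalized) log-belief at a node, so particles can be resampled without designing a model-specific proposal.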
Original language: English
Number of pages: 8
Publication status: Published - 8 Dec 2013
Event: 2013 IEEE International Conference on Computer Vision (ICCV) - Sydney, Australia
Duration: 1 Dec 2013 - 8 Dec 2013


Conference: 2013 IEEE International Conference on Computer Vision (ICCV)
