Slice-Sampling Particle Belief Propagation

IEEE International Conference on Computer Vision (ICCV), December 2013.

Oliver Müller, Michael Ying Yang and Bodo Rosenhahn


Abstract

Inference in continuous label Markov random fields is a challenging task. We use particle belief propagation (PBP) for solving the maximum a posteriori (MAP) problem in continuous label space. Particles are typically sampled from the belief distribution using Metropolis-Hastings (MH) Markov chain Monte Carlo (MCMC) methods, which involve sampling from a proposal distribution. This proposal distribution has to be carefully designed depending on the particular model and input data to achieve fast convergence. We propose to avoid this dependence on a proposal distribution by introducing a slice-sampling-based PBP algorithm. The proposed approach shows superior convergence performance on an image denoising toy example. Our findings are validated on a challenging relational 2D feature tracking application.
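As background, the sketch below shows a minimal univariate slice sampler (Neal, 2003) with stepping-out and shrinkage, as it might be used to draw particles from an unnormalized per-node belief in 1D label space. The function names and the example belief are illustrative assumptions, not part of the released library; the only tuning parameter is the initial bracket width, which replaces the hand-designed MH proposal distribution.

import math
import random

def slice_sample(log_density, x0, w=1.0, max_steps=50, n_samples=100):
    # Univariate slice sampling with stepping-out and shrinkage.
    # log_density: log of the (unnormalized) target, e.g. a per-node belief.
    # x0:          current particle position.
    # w:           initial bracket width (the only tuning parameter).
    samples = []
    x = x0
    for _ in range(n_samples):
        # 1. Draw the slice level: log(u * p(x)) with u ~ Uniform(0, 1).
        log_y = log_density(x) - random.expovariate(1.0)

        # 2. Step out: place a bracket [l, r] around x at random and widen it
        #    until both ends fall below the slice level (or a step limit is hit).
        l = x - w * random.random()
        r = l + w
        steps = max_steps
        while steps > 0 and log_density(l) > log_y:
            l -= w
            steps -= 1
        steps = max_steps
        while steps > 0 and log_density(r) > log_y:
            r += w
            steps -= 1

        # 3. Shrinkage: sample uniformly from the bracket, shrinking it
        #    towards x whenever the candidate falls outside the slice.
        while True:
            x_new = l + (r - l) * random.random()
            if log_density(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                l = x_new
            else:
                r = x_new
        samples.append(x)
    return samples

# Illustrative unnormalized belief: a bimodal Gaussian mixture in 1D label space.
def log_belief(x):
    return math.log(0.6 * math.exp(-0.5 * (x - 1.0) ** 2)
                    + 0.4 * math.exp(-0.5 * ((x + 2.0) / 0.5) ** 2) + 1e-300)

particles = slice_sample(log_belief, x0=0.0, w=1.0, n_samples=200)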


Code

You can download our library, including example sources, here.


Project page

See our project page for further details about our relational feature tracking application.


ERC Starting Grants

This project has been partially funded by the ERC within the Starting Grant Dynamic MinVIP.