Oliver Müller, Michael Ying Yang and Bodo Rosenhahn
Inference in continuous label Markov random fields is a challenging task. We use particle belief propagation (PBP) for solving the maximum a posteriori (MAP) problem in continuous label space. Sampling particles from the belief distribution is typically done using Metropolis-Hastings (MH) Markov chain Monte Carlo (MCMC) methods, which involve sampling from a proposal distribution. This proposal distribution has to be carefully designed for the particular model and input data to achieve fast convergence. We propose to avoid the dependence on a proposal distribution by introducing a slice sampling based PBP algorithm. The proposed approach shows superior convergence performance on an image denoising toy example. Our findings are validated on a challenging relational 2D feature tracking application.
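To illustrate the key ingredient, here is a minimal sketch of univariate slice sampling (Neal's stepping-out and shrinkage scheme), which draws from an unnormalized density without any user-tuned proposal distribution. This is a generic illustration, not the paper's PBP implementation; the function names and parameters are our own.

```python
import random

def slice_sample(log_density, x0, w=1.0, n_samples=1000, max_steps=50):
    """Univariate slice sampling with stepping-out and shrinkage.

    log_density: log of an (unnormalized) target density.
    x0: starting point; w: initial bracket width (a scale guess,
    not a proposal distribution that must match the target).
    """
    samples = []
    x = x0
    for _ in range(n_samples):
        # Vertical level under the curve: log(u), u ~ Uniform(0, p(x)).
        log_y = log_density(x) - random.expovariate(1.0)
        # Step out to find an interval [l, r] containing the slice.
        l = x - w * random.random()
        r = l + w
        steps = max_steps
        while steps > 0 and log_density(l) > log_y:
            l -= w
            steps -= 1
        steps = max_steps
        while steps > 0 and log_density(r) > log_y:
            r += w
            steps -= 1
        # Shrinkage: sample uniformly from [l, r], shrinking on rejection.
        while True:
            x1 = l + (r - l) * random.random()
            if log_density(x1) > log_y:
                x = x1
                break
            if x1 < x:
                l = x1
            else:
                r = x1
        samples.append(x)
    return samples
```

In a PBP setting, a sampler of this kind would draw each node's particles from its (unnormalized) belief; the only tuning parameter is the bracket width `w`, and the sampler adapts the interval automatically, which is the practical advantage over MH proposals.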
You can download our library, including example sources, here.
See our project page for further details about our relational feature tracking application.