Description

Languages for open-universe probability models (OUPMs) can represent situations with an unknown number of objects and identity uncertainty, which comprise a very important class of real-world applications. Current general-purpose inference methods for such languages are, however, much less efficient than those implemented for more restricted languages or for specific model classes. This paper goes some way toward remedying this deficit by introducing, and proving correct, a general method for Gibbs sampling in partial worlds where model structure may vary across worlds. The method draws on and extends previous results on generic OUPM inference and on auxiliary-variable Gibbs sampling for non-parametric mixture models. It has been implemented for BLOG, a well-known OUPM language. Combined with compile-time optimizations, it yields very substantial speedups over existing methods on several test cases and substantially improves the practicality of OUPM languages in general.
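
The abstract cites auxiliary-variable Gibbs sampling for non-parametric mixture models as one of the ingredients the method builds on. The sketch below illustrates only that general idea, in the style of Neal's Algorithm 8 for a Dirichlet-process mixture of 1-D Gaussians with known variance; the toy model, parameter names, and settings are illustrative assumptions and do not reproduce the paper's partial-world Gibbs sampler or the BLOG implementation.

```python
# A minimal sketch (not the paper's algorithm): auxiliary-variable Gibbs
# sampling for a mixture with an unknown number of components, in the style
# of Neal's Algorithm 8 for Dirichlet-process mixtures. All constants and
# the toy model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated Gaussian clumps.
x = np.concatenate([rng.normal(-4.0, 1.0, 40), rng.normal(3.0, 1.0, 60)])

alpha, sigma2, tau2, m_aux = 1.0, 1.0, 25.0, 3  # DP conc., lik. var., prior var., aux. slots

z = np.zeros(len(x), dtype=int)        # cluster assignment of each point
mus = {0: float(x.mean())}             # cluster id -> mean parameter

def sample_mu(points):
    """Conjugate normal update for a cluster mean given its assigned points."""
    n = len(points)
    var = 1.0 / (n / sigma2 + 1.0 / tau2)
    mean = var * (np.sum(points) / sigma2)
    return rng.normal(mean, np.sqrt(var))

for sweep in range(200):
    for i in range(len(x)):
        # Remove point i; drop its cluster if it becomes empty.
        old = z[i]
        z[i] = -1
        if not np.any(z == old):
            del mus[old]

        ids = list(mus.keys())
        counts = np.array([np.sum(z == k) for k in ids], dtype=float)
        # Auxiliary parameters drawn from the prior stand in for "new" clusters.
        aux_mus = rng.normal(0.0, np.sqrt(tau2), m_aux)
        cand_mus = np.concatenate([[mus[k] for k in ids], aux_mus])
        weights = np.concatenate([counts, np.full(m_aux, alpha / m_aux)])
        lik = np.exp(-0.5 * (x[i] - cand_mus) ** 2 / sigma2)
        probs = weights * lik
        probs /= probs.sum()

        choice = rng.choice(len(cand_mus), p=probs)
        if choice < len(ids):
            z[i] = ids[choice]          # join an existing cluster
        else:
            new_id = max(mus) + 1 if mus else 0
            mus[new_id] = float(cand_mus[choice])  # instantiate a new cluster
            z[i] = new_id

    # Resample each surviving cluster's mean from its conditional posterior.
    for k in list(mus):
        mus[k] = float(sample_mu(x[z == k]))

print("clusters:", {k: int(np.sum(z == k)) for k in mus})
```

The auxiliary parameters drawn from the prior act as candidate "new" components, which is what lets the sampler move between states with different numbers of components; the paper's contribution extends this kind of move to general open-universe models with varying structure.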
