Scaling up Hybrid Probabilistic Inference with Logical and Arithmetic Constraints via Message Passing

Published at International Conference on Machine Learning (ICML), 2020

**Zhe Zeng**, Paolo Morettin, Fanqi Yan, Antonio Vergari, Guy Van den Broeck. http://starai.cs.ucla.edu/papers/ZengICML20.pdf

Abstract: Weighted model integration (WMI) is an appealing framework for probabilistic inference: it allows for expressing the complex dependencies of real-world problems, where variables are both continuous and discrete, via the language of Satisfiability Modulo Theories (SMT), as well as for computing probabilistic queries with complex logical and arithmetic constraints. Yet, existing WMI solvers are not ready to scale to these problems. They either ignore the intrinsic dependency structure of the problem entirely, or they are limited to overly restrictive structures. To narrow this gap, we derive a factorized WMI computation enabling us to devise a scalable WMI solver based on message passing, called MP-WMI. Namely, MP-WMI is the first WMI solver that can (i) perform exact inference on the full class of tree-structured WMI problems, and (ii) perform inter-query amortization, e.g., to compute all marginal densities simultaneously. Experimental results show that our solver dramatically outperforms the existing WMI solvers on a large set of benchmarks.
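To make the WMI setting concrete, here is a minimal toy sketch (not the paper's MP-WMI solver): a hybrid problem with one Boolean variable and one continuous variable, an SMT(LRA)-style formula as support, and a per-assignment weight function. The formula, weight, and query below are illustrative assumptions chosen for this example; the integral is approximated by brute-force enumeration of Boolean assignments plus midpoint-rule numerical integration, whereas the paper computes such quantities exactly and amortizes them across queries via message passing.

```python
# Toy weighted model integration (WMI) by brute force -- an illustrative
# sketch only, not the MP-WMI algorithm from the paper.
# Hypothetical problem: Boolean A, continuous x in [0, 1],
# formula: A -> (x > 0.5), weight: w(A, x) = 2x if A else 1.

def formula(a: bool, x: float) -> bool:
    # SMT-style support: the logical constraint A -> (x > 0.5),
    # with the arithmetic bound 0 <= x <= 1.
    return (0.0 <= x <= 1.0) and ((not a) or x > 0.5)

def weight(a: bool, x: float) -> float:
    # Unnormalized density attached to each (Boolean, continuous) assignment.
    return 2.0 * x if a else 1.0

def wmi(event=lambda a, x: True, n: int = 100_000) -> float:
    # Sum over Boolean assignments; midpoint-rule integral over x of the
    # weight, restricted to points satisfying both formula and query event.
    h = 1.0 / n
    total = 0.0
    for a in (False, True):
        for i in range(n):
            x = (i + 0.5) * h
            if formula(a, x) and event(a, x):
                total += weight(a, x) * h
    return total

z = wmi()                           # partition function: 1 + int_{0.5}^{1} 2x dx = 1.75
p = wmi(lambda a, x: x > 0.75) / z  # query probability P(x > 0.75)
print(z, p)
```

A probabilistic query is then a ratio of two such integrals, as in the last line; the cost of this naive approach grows exponentially in the number of Boolean variables, which is the scalability gap that a structure-exploiting solver like MP-WMI targets.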