We present RxInfer.jl, a Julia package for automated, fast Bayesian inference in probabilistic models through reactive message passing on a factor graph representation of the model. RxInfer.jl unites several Julia packages into a user-friendly ecosystem for efficient real-time Bayesian processing of infinite data streams.
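To give a flavour of the workflow (this example is not part of the abstract; it follows the coin-toss example from the RxInfer documentation, and the exact syntax may differ between package versions), a generative model is specified with the @model macro and inference is triggered with a single call:

using RxInfer

# Beta-Bernoulli model: estimate the bias θ of a coin from observed tosses y.
# (Model and variable names are illustrative only.)
@model function coin_model(y, a, b)
    θ  ~ Beta(a, b)      # prior belief over the coin bias
    y .~ Bernoulli(θ)    # each observation is a Bernoulli draw with parameter θ
end

# Simulated observations: 100 tosses of a coin with true bias 0.75
dataset = float.(rand(Bernoulli(0.75), 100))

# Automated message passing-based inference in one call
result = infer(
    model = coin_model(a = 2.0, b = 7.0),
    data  = (y = dataset,),
)

result.posteriors[:θ]    # Beta posterior over the coin bias

Behind this call, the model is compiled to a factor graph and the posterior is computed by message passing rather than by sampling or black-box gradient optimization.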
Background: Bayesian inference realizes optimal information processing through a full commitment to reasoning with probability theory. The Bayesian framework sits at the core of modern AI technology for applications such as speech and image recognition and generation, medical analysis, and robot navigation, as it describes how a rational agent ought to update its beliefs when new information is revealed by the agent's environment. Unfortunately, exact Bayesian reasoning is generally intractable, since it requires the computation of (often) very high-dimensional integrals. As a result, a number of numerical algorithms for approximate Bayesian inference have been developed and implemented in probabilistic programming packages. Successful methods include variants of Monte Carlo (MC) sampling, Variational Inference (VI), and the Laplace approximation.
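For concreteness, in standard notation (not used elsewhere in this abstract), with latent variables \theta and observations y, exact Bayesian inference computes

$$ p(\theta \mid y) = \frac{p(y \mid \theta)\, p(\theta)}{p(y)}, \qquad p(y) = \int p(y \mid \theta)\, p(\theta)\, \mathrm{d}\theta, $$

and it is the evidence integral p(y), taken jointly over all latent variables, that becomes intractable as the dimensionality of \theta grows.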
Problem statement: Many important AI applications, such as self-driving vehicles and extended-reality video processing, require real-time Bayesian inference. However, sampling-based inference methods do not scale well to realistic probabilistic models with a large number of latent states, which makes Monte Carlo methods unsuitable for real-time applications. Variational Inference promises to scale better than sampling-based inference, but VI requires the derivation of gradients of a "variational Free Energy" cost function. For large models, manual derivation of these gradients is not feasible, and automated "black-box" gradient methods are too inefficient for real-time inference. Therefore, although Bayesian inference is known as the optimal data processing framework, in practice real-time AI applications rely on much simpler, often ad hoc, data processing algorithms.
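For reference, the cost function in question is, in standard notation, the variational free energy of an approximating distribution q(\theta):

$$ F[q] = \mathbb{E}_{q(\theta)}\!\left[ \log \frac{q(\theta)}{p(y, \theta)} \right] = \mathrm{KL}\big[\, q(\theta) \,\|\, p(\theta \mid y) \,\big] - \log p(y). $$

Minimizing F[q] over a tractable family for q simultaneously approximates the posterior and bounds the model evidence; gradient-based VI performs this minimization by differentiating F with respect to the parameters of q.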
Solution proposal: We present RxInfer.jl, a package for processing infinite data streams by real-time Bayesian inference in large probabilistic models. RxInfer is open source and available at http://rxinfer.ml.
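As a conceptual sketch of what processing an infinite data stream means here (this is not RxInfer's dedicated reactive streaming API, which subscribes to data observables and updates beliefs automatically; it is a hypothetical illustration of sequential Bayesian updating using only the batch inference call, with all names and the simulated stream being assumptions):

using RxInfer

# Hypothetical single-observation model: one coin toss y under a Beta prior on θ.
@model function online_coin_model(y, a, b)
    θ ~ Beta(a, b)
    y ~ Bernoulli(θ)
end

# Process a (potentially infinite) stream one observation at a time,
# feeding the posterior of step t back in as the prior of step t + 1.
function run_online_inference(stream)
    a, b = 1.0, 1.0                          # vague initial prior
    for y_t in stream
        result = infer(
            model = online_coin_model(a = a, b = b),
            data  = (y = y_t,),
        )
        # Assumes the returned posterior is a Distributions.Beta object
        a, b = params(result.posteriors[:θ])
    end
    return Beta(a, b)
end

# Stand-in for a real sensor stream: a generator of simulated coin tosses
stream    = (float(rand(Bernoulli(0.75))) for _ in 1:1_000)
posterior = run_online_inference(stream)

The package automates this update-as-data-arrives pattern reactively, so beliefs are refreshed whenever a new observation appears on the stream rather than by an explicit loop.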
Evaluation: Over the past few years, the ecosystem has been tested on many advanced probabilistic models, leading to several publications in high-ranked journals such as Entropy [1] and Frontiers [2], and at conferences such as MLSP [3], ISIT [4], and PGM [5]. A fast and user-friendly automated Bayesian inference framework is a key factor in expanding the applicability of Bayesian inference methods to real-time AI applications. More generally, access to fast Bayesian inference will benefit the wider statistical research community. We are excited to present our work on RxInfer and to discuss its strengths and limitations.
References: More at biaslab.github.io/publication/.
[1] Message Passing and Local Constraint Manipulation in Factor Graphs, Entropy. Special Issue on Approximate Bayesian Inference.
[2] AIDA: An Active Inference-Based Design Agent for Audio Processing Algorithms, Frontiers in Signal Processing.
[3] Message Passing-Based Inference in the Gamma Mixture Model, IEEE 31st International Workshop on Machine Learning for Signal Processing.
[4] The Switching Hierarchical Gaussian Filter, IEEE International Symposium on Information Theory.
[5] Online Single-Microphone Source Separation using Non-Linear Autoregressive Models, International Conference on Probabilistic Graphical Models (PGM).