Approximate Bayesian inference using Expectation-Propagation (EP)

For applications where exact inference is not practical, we have developed and applied approximate message-passing methods, in particular EP, for scalable inference and uncertainty quantification. Among other uses, EP enables faster inference for:
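To make the EP machinery concrete before turning to these applications, the sketch below runs a few EP sweeps on a hypothetical toy problem: a one-dimensional latent score with a Gaussian prior, observed through probit likelihoods. Each likelihood term is replaced by a Gaussian "site" that is refined by moment matching against the tilted distribution. The model, data, and NumPy/SciPy implementation are illustrative assumptions only, not the message-passing schemes developed in our work.

```python
# Minimal 1-D Expectation Propagation sketch (illustrative toy example).
# Hypothetical model: latent score theta ~ N(0, 1), binary outcomes
# y_i in {-1, +1} observed through a probit likelihood p(y_i | theta) = Phi(y_i * theta).
import numpy as np
from scipy.stats import norm

y = np.array([1, 1, -1, 1, 1, 1, -1, 1])   # toy binary observations

prior_tau, prior_nu = 1.0, 0.0             # natural parameters of the N(0, 1) prior
site_tau = np.zeros_like(y, dtype=float)   # site precisions (start at 0)
site_nu = np.zeros_like(y, dtype=float)    # site precision-times-mean

for sweep in range(20):
    for i in range(len(y)):
        # Global Gaussian approximation in natural parameters.
        tau = prior_tau + site_tau.sum()
        nu = prior_nu + site_nu.sum()

        # 1) Cavity: remove site i from the global approximation.
        tau_cav = tau - site_tau[i]
        nu_cav = nu - site_nu[i]
        m_cav, v_cav = nu_cav / tau_cav, 1.0 / tau_cav

        # 2) Moments of the tilted distribution Phi(y_i*theta) * N(m_cav, v_cav)
        #    (standard probit-Gaussian moment formulas).
        z = y[i] * m_cav / np.sqrt(1.0 + v_cav)
        ratio = norm.pdf(z) / norm.cdf(z)
        m_hat = m_cav + y[i] * v_cav * ratio / np.sqrt(1.0 + v_cav)
        v_hat = v_cav - v_cav**2 * ratio * (z + ratio) / (1.0 + v_cav)

        # 3) Update site i so that cavity * site matches the tilted moments.
        site_tau[i] = 1.0 / v_hat - tau_cav
        site_nu[i] = m_hat / v_hat - nu_cav

tau = prior_tau + site_tau.sum()
nu = prior_nu + site_nu.sum()
print(f"EP posterior: mean={nu / tau:.3f}, var={1.0 / tau:.3f}")
```

The cavity / moment-matching / site-update cycle shown here is the core operation that EP repeats, one approximate factor at a time, which is what makes the scheme scalable.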

Variational autoencoders (VAEs) and Bayesian deep learning networks

For systems and operations that can be effectively represented in lower-dimensional spaces, we have developed various VAE architectures for scalable inference and uncertainty quantification. Compared with other deep learning methods, VAEs are more robust to changes in the underlying system and provide better uncertainty estimates. Example applications include:
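As a minimal illustration of the basic ingredients of such an architecture, the sketch below trains a small VAE in PyTorch on synthetic data that lies near a low-dimensional subspace; repeated sampling of the latent variable then gives a simple reconstruction-uncertainty estimate. The layer sizes, latent dimension, loss, and training setup are placeholder assumptions, not the architectures referred to above.

```python
# Minimal variational autoencoder sketch in PyTorch (illustrative only).
import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self, x_dim=64, z_dim=4, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU())
        self.to_mu = nn.Linear(hidden, z_dim)       # latent mean
        self.to_logvar = nn.Linear(hidden, z_dim)   # latent log-variance
        self.decoder = nn.Sequential(
            nn.Linear(z_dim, hidden), nn.ReLU(), nn.Linear(hidden, x_dim)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterisation trick: z = mu + sigma * eps, eps ~ N(0, I).
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.decoder(z), mu, logvar

def elbo_loss(x, x_hat, mu, logvar):
    # Negative ELBO: squared-error reconstruction + analytic Gaussian KL term.
    recon = ((x - x_hat) ** 2).sum(dim=1).mean()
    kl = -0.5 * (1 + logvar - mu**2 - logvar.exp()).sum(dim=1).mean()
    return recon + kl

# Synthetic data that lives near a 4-dimensional subspace of a 64-dim space.
torch.manual_seed(0)
data = torch.randn(512, 4) @ torch.randn(4, 64) + 0.05 * torch.randn(512, 64)

model = VAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(500):
    x_hat, mu, logvar = model(data)
    loss = elbo_loss(data, x_hat, mu, logvar)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Crude uncertainty estimate: spread of reconstructions over latent samples.
with torch.no_grad():
    samples = torch.stack([model(data[:1])[0] for _ in range(100)])
    print("reconstruction std per feature:", samples.std(dim=0).mean().item())
```

At test time, drawing many latent samples for the same input yields an ensemble of reconstructions whose spread serves as a simple uncertainty estimate, as in the last lines of the sketch.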