### Massively parallel probabilistic inference and learning

Statistics Seminar

22nd March 2024, 2:00 pm – 3:00 pm

Fry Building, 2.41

Many probabilistic inference algorithms, from importance sampling to reweighted wake-sleep (RWS) and importance-weighted variational autoencoders (IWAE), fundamentally operate by drawing K samples from a proposal/approximate posterior, then reweighting those samples according to the true posterior. However, the number of samples required for reweighting to give accurate posterior estimates scales as e^n, where n is the number of latent variables, which is intractable in all but the smallest models. We propose, in effect, to obtain exponentially many samples by instead drawing K samples for each of the n latent variables and considering all K^n combinations. While considering all K^n combinations again looks intractable, it turns out to be possible if we exploit conditional independencies in the model using message-passing-like algorithms. This gives massively parallel variants of a number of existing algorithms, and we implement these algorithms in "Alan", a new massively parallel probabilistic programming language.
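To illustrate the idea in the abstract, here is a minimal NumPy sketch (not the Alan implementation; the chain model, proposal, and all parameter values are hypothetical). It draws K proposal samples for each of n latent variables in a Markov chain and estimates the marginal likelihood over all K^n sample combinations, but never enumerates them: because the model factorises along the chain, a length-K message can be contracted one latent at a time, for O(n K^2) cost instead of O(K^n).

```python
import numpy as np

def logsumexp(a, axis=None):
    # Numerically stable log-sum-exp, so we can work with log-weights.
    m = np.max(a, axis=axis, keepdims=True)
    out = np.log(np.sum(np.exp(a - m), axis=axis, keepdims=True)) + m
    return out.item() if axis is None else np.squeeze(out, axis=axis)

def log_norm(x, mean, var):
    # log density of N(mean, var) evaluated at x (elementwise).
    return -0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

rng = np.random.default_rng(0)
K, n = 50, 4          # K samples per latent, n latents in a chain (illustrative)
x_obs = 1.5           # hypothetical observation

# Hypothetical model: z_1 ~ N(0,1), z_i | z_{i-1} ~ N(z_{i-1}, 1), x | z_n ~ N(z_n, 1).
# Hypothetical proposal: each z_i drawn independently from N(0, 2).
z = rng.normal(0.0, np.sqrt(2.0), size=(n, K))   # K proposal samples per latent
log_q = log_norm(z, 0.0, 2.0)                    # (n, K) proposal log-densities

# Message passing over the chain: the average of importance weights over all
# K^n combinations factorises, so we carry a length-K message forward.
msg = log_norm(z[0], 0.0, 1.0) - log_q[0]        # prior/proposal term for z_1, shape (K,)
for i in range(1, n):
    # Pairwise term log p(z_i | z_{i-1}) for all K x K sample pairs.
    pair = log_norm(z[i][None, :], z[i - 1][:, None], 1.0)   # (K, K)
    # Contract out z_{i-1}; one factor of 1/K per latent averaged over.
    msg = logsumexp(msg[:, None] + pair, axis=0) - np.log(K) - log_q[i]

msg = msg + log_norm(x_obs, z[-1], 1.0)          # likelihood term on z_n
log_Z = logsumexp(msg) - np.log(K)               # log marginal-likelihood estimate
print(log_Z)
```

Under this chain, the exact marginal is x ~ N(0, n + 1), so the estimate should land near log N(1.5; 0, 5) ≈ -1.95 even though only n·K = 200 samples were drawn, illustrating how the K^n combinations sharpen the estimate at message-passing cost.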

*Organiser*: Juliette Unwin
