Monotonic Alpha-divergence Variational Inference
Statistics Seminar
1st October 2021, 4:00 pm – 5:00 pm
Virtual Seminar, Zoom link: TBA
Variational Inference methods have made it possible to construct fast algorithms for approximating posterior distributions. Yet the theoretical guarantees and empirical performance of Variational Inference methods are often limited by two factors: (i) an inappropriate choice of the objective function appearing in the optimisation problem, and (ii) a search space that is too restrictive to match the target at the end of the optimisation procedure.
In this talk, we explore how to remedy these two issues in order to build improved Variational Inference methods. More specifically, we suggest selecting the alpha-divergence as a more general class of objective functions, and we propose several ways to enlarge the search space beyond the traditional Variational Inference framework. The distinguishing feature of our approach is that we derive numerically advantageous algorithms that provably ensure a systematic decrease in the alpha-divergence at each step. In addition, our framework reveals important connections with gradient-based schemes from the optimisation literature, as well as with an integrated EM algorithm from the importance sampling literature.
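For reference, the alpha-divergence family mentioned above is commonly written (in one standard parameterisation; the talk may adopt a different but equivalent convention) for densities p and q as:

```latex
% Alpha-divergence between densities p and q (one common parameterisation)
D_{\alpha}(p \,\|\, q)
  = \frac{1}{\alpha(\alpha - 1)}
    \left( \int p(x)^{\alpha} \, q(x)^{1-\alpha} \, \mathrm{d}x - 1 \right),
  \qquad \alpha \in \mathbb{R} \setminus \{0, 1\}.
```

The limits recover the two Kullback–Leibler divergences: as alpha tends to 1 one obtains KL(p || q), and as alpha tends to 0 one obtains KL(q || p), the objective underlying standard Variational Inference.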