Variational Inference via Wasserstein gradient flows by Philippe Rigollet
Date: March 29, 2023
Abstract: The main quantities of interest in Bayesian inference are arguably the first two moments of the posterior distribution. In the past decades, variational inference (VI) has emerged as a tractable approach to approximating these summary statistics, and a viable alternative to the more established paradigm of Markov chain Monte Carlo. Rather than sampling from the true posterior, VI aims at producing a simple but good approximation of the posterior for which summary statistics are easy to compute; in this presentation, we consider the case of Gaussian (or mixture of Gaussians) approximations. This turns VI into an optimization problem over the space of Gaussian probability measures, which we solve using the theory of gradient flows. Our theoretical results include convergence guarantees and a quantification of the approximation error. Based on
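As a rough illustration of the kind of optimization the abstract describes, the sketch below runs a Bures-Wasserstein-style gradient descent over Gaussians N(m, Σ) to minimize the KL divergence to a target. The target is taken to be Gaussian N(mu, S) purely so that the required expectations of the gradient and Hessian of the potential are available in closed form; all names, step sizes, and the specific update rule are illustrative assumptions, not details from the talk.

```python
import numpy as np

def bw_gradient_descent(mu, S, steps=500, h=0.1):
    """Minimize KL(N(m, Sigma) || N(mu, S)) over Gaussians via a
    Bures-Wasserstein gradient-descent sketch (illustrative only)."""
    d = len(mu)
    S_inv = np.linalg.inv(S)
    m = np.zeros(d)       # initial mean
    Sigma = np.eye(d)     # initial covariance
    for _ in range(steps):
        # For the Gaussian target, E_q[grad V] = S^{-1} (m - mu)
        m = m - h * S_inv @ (m - mu)
        # Covariance step: Sigma <- M Sigma M, with
        # M = I - h (E_q[Hess V] - Sigma^{-1}) and E_q[Hess V] = S^{-1}
        M = np.eye(d) - h * (S_inv - np.linalg.inv(Sigma))
        Sigma = M @ Sigma @ M
    return m, Sigma

# Toy target: for a Gaussian target the iterates should recover it exactly.
mu = np.array([1.0, -2.0])
S = np.array([[2.0, 0.5],
              [0.5, 1.0]])
m, Sigma = bw_gradient_descent(mu, S)
print(m, Sigma)
```

On this toy problem the mean and covariance converge to those of the target, which is the behavior one expects from the convergence guarantees mentioned in the abstract (here in the easiest possible case).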