### Polynomial-time guarantees for sampling-based posterior inference

Statistics Seminar

28th April 2023, 1:00 pm – 2:00 pm

Fry Building, 2.41

The Bayesian approach provides a flexible and popular framework for a wide range of non-parametric inference problems. It relies crucially on computing functionals with respect to the posterior distribution, such as the posterior mean or posterior quantiles for uncertainty quantification. In practice, this requires sampling from the posterior distribution using numerical algorithms, e.g., Markov chain Monte Carlo (MCMC) methods. Without additional structural assumptions such as strong log-concavity of the posterior, the runtime these algorithms need to achieve a given target precision will typically scale exponentially in the model dimension and the sample size. Concrete examples of models without strongly log-concave posteriors include nonlinear inverse problems arising from partial differential equations and generalised linear models. In contrast, in this talk we show that sampling-based posterior inference in a general high-dimensional setup is feasible. Given a sufficiently good initialiser, we present polynomial-time convergence guarantees for a widely used gradient-based MCMC sampling scheme. The key idea is to combine posterior contraction with the local curvature induced by the Fisher information of the statistical model near the underlying truth. We will discuss applications to high-dimensional logistic and Gaussian regression, as well as to density estimation.
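To make the setting concrete: a standard gradient-based MCMC scheme of the kind referred to here is the (unadjusted) Langevin algorithm, which moves along the gradient of the log-posterior plus Gaussian noise. The sketch below applies it to a Bayesian logistic regression posterior with a Gaussian prior; the data, step size, iteration count, and warm-start choice are all illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic logistic-regression data (purely illustrative).
n, d = 200, 5
X = rng.normal(size=(n, d))
theta_true = rng.normal(size=d)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ theta_true)))

def grad_log_posterior(theta, prior_prec=1.0):
    """Gradient of the log-posterior: logistic log-likelihood
    plus an isotropic Gaussian prior with precision prior_prec."""
    p = 1.0 / (1.0 + np.exp(-X @ theta))
    return X.T @ (y - p) - prior_prec * theta

def ula(theta0, step, n_iter):
    """Unadjusted Langevin algorithm:
    theta <- theta + step * grad + sqrt(2 * step) * N(0, I)."""
    theta = theta0.copy()
    samples = np.empty((n_iter, theta.size))
    for t in range(n_iter):
        theta = (theta + step * grad_log_posterior(theta)
                 + np.sqrt(2.0 * step) * rng.normal(size=theta.size))
        samples[t] = theta
    return samples

# A "sufficiently good initialiser": in practice e.g. a preliminary MAP
# estimate; here the true parameter stands in for such a warm start.
samples = ula(theta_true.copy(), step=1e-3, n_iter=2000)
posterior_mean = samples[1000:].mean(axis=0)  # discard burn-in
```

Posterior functionals such as the mean or quantiles are then estimated by averaging over the retained iterates, which is exactly the computation whose runtime the polynomial-time guarantees control.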

*Organiser*: Juliette Unwin
