Distilling importance sampling for likelihood-free inference
1st October 2021, 3:00 pm – 4:00 pm
Virtual Seminar, Zoom link: TBA
To be efficient, importance sampling requires an accurate proposal distribution. This talk describes learning proposal distributions by optimisation over a flexible family of densities developed in machine learning: normalising flows. In the likelihood-free setting, training data are generated by running ABC importance sampling with a large bandwidth parameter, and these data are "distilled" by using them to train the normalising flow. Over many iterations of importance sampling and optimisation, the bandwidth is slowly reduced, until the result is an importance sampling proposal for a good ABC approximation to the posterior.
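The iterative scheme above can be sketched in a toy setting. This is not the talk's method: the normalising flow is replaced by a simple Gaussian proposal, and "training" it is replaced by weighted moment matching on the importance sample; the model, prior, summary statistic, and bandwidth schedule are all illustrative assumptions. The sketch only shows the loop structure: propose, simulate, weight with an ABC kernel, refit the proposal, shrink the bandwidth.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy model (not from the talk): data ~ N(theta, 1),
# prior theta ~ N(0, 10^2), summary statistic = sample mean.
y_obs = rng.normal(2.0, 1.0, size=50)
s_obs = y_obs.mean()

def simulate(theta, n=50):
    """Run the simulator for one theta; return its summary statistic."""
    return rng.normal(theta, 1.0, size=n).mean()

# Gaussian proposal q(theta) = N(mu, sigma^2) stands in for the
# normalising flow; distillation = refitting (mu, sigma) to the
# weighted importance sample.
mu, sigma = 0.0, 5.0   # broad initial proposal
eps = 5.0              # large initial ABC bandwidth

for it in range(30):
    theta = rng.normal(mu, sigma, size=2000)       # draw from the proposal
    s = np.array([simulate(t) for t in theta])     # run the simulator
    # ABC kernel: Gaussian with bandwidth eps on the summary discrepancy
    log_w = -0.5 * ((s - s_obs) / eps) ** 2
    # importance correction: log prior minus log proposal density
    log_w += -0.5 * (theta / 10.0) ** 2
    log_w -= -0.5 * ((theta - mu) / sigma) ** 2 - np.log(sigma)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # "distil": refit the proposal to the weighted sample
    mu = np.sum(w * theta)
    sigma = max(np.sqrt(np.sum(w * (theta - mu) ** 2)), 1e-3)
    eps = max(0.9 * eps, 0.1)                      # slowly shrink the bandwidth

print(f"posterior approximation: mean {mu:.2f}, sd {sigma:.2f}")
```

Because the bandwidth starts large, early iterations target a heavily smoothed posterior that the broad proposal covers well; each refit then tracks the sharpening target as `eps` decreases, which is the self-reinforcing loop the abstract describes.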
The method will be demonstrated on likelihood-free inference for a queueing model. In this example we infer not only the parameters but also the random variables used to simulate the data. In effect, we learn to control the simulator so that it produces simulations closely matching the observations.
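Treating the simulator's random draws as inference targets relies on a reparameterisation: once the underlying uniform random numbers are fixed, the simulator becomes a deterministic function of its parameters. The sketch below shows this for a generic single-server queue; the model, function names, and the choice of inter-departure times as output are illustrative assumptions, not the talk's exact example.

```python
import numpy as np

def queue_simulator(arrival_rate, service_rate, u_arr, u_srv):
    """Single-server queue driven by explicit uniform random numbers.

    Given the parameters AND the uniforms (u_arr, u_srv), the output is
    deterministic, so the uniforms can be inferred alongside the
    parameters. Returns the inter-departure times.
    """
    # Inverse-CDF transform: uniforms -> exponential variates
    inter_arrivals = -np.log(u_arr) / arrival_rate
    services = -np.log(u_srv) / service_rate
    arrivals = np.cumsum(inter_arrivals)
    departures = np.empty_like(arrivals)
    prev_dep = 0.0
    for i, (a, s) in enumerate(zip(arrivals, services)):
        start = max(a, prev_dep)   # service starts on arrival, or when the server frees up
        prev_dep = start + s
        departures[i] = prev_dep
    return np.diff(np.concatenate(([0.0], departures)))

# Usage: fixing the uniforms pins down one realisation of the queue
rng = np.random.default_rng(1)
n = 10
u_arr, u_srv = rng.uniform(size=n), rng.uniform(size=n)
out = queue_simulator(1.0, 2.0, u_arr, u_srv)
```

Inferring `u_arr` and `u_srv` jointly with the rates is what "controlling the simulator" amounts to: the posterior favours those random draws under which the simulated output closely matches the observed data.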