Posterior consistency for partially observed Markov models
8th December 2017, 2:00 pm – 3:00 pm
Main Maths Building, SM3
We establish posterior consistency for a parametrized family of partially observed, fully dominated Markov models. The prior is assumed to assign positive probability to every neighborhood of the true parameter, for a distance induced by the expected Kullback–Leibler divergence between the Markov transition densities of the family members. This assumption is, in general, easy to check. In addition, we show that posterior consistency is implied by consistency of the maximum likelihood estimator. The result is extended to possibly non-compact parameter spaces and non-stationary observations. Finally, we verify our assumptions on a linear Gaussian model and a well-known stochastic volatility model.
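As a concrete illustration of the model class discussed (not material from the talk itself), the following is a minimal sketch of a partially observed linear Gaussian Markov model: a hidden AR(1) state observed in Gaussian noise, with the exact observation log-likelihood computed by a Kalman filter. All function and parameter names here are illustrative assumptions, not notation from the talk.

```python
import numpy as np

def simulate_lgssm(phi, sigma_x, sigma_y, n, rng):
    """Simulate a partially observed linear Gaussian Markov model:
    hidden state  X_t = phi * X_{t-1} + sigma_x * eps_t  (|phi| < 1),
    observation   Y_t = X_t + sigma_y * eta_t,
    with eps_t, eta_t i.i.d. standard normal."""
    x = np.zeros(n)
    # Draw X_0 from the stationary distribution of the AR(1) chain.
    x[0] = rng.normal(0.0, sigma_x / np.sqrt(1.0 - phi**2))
    for t in range(1, n):
        x[t] = phi * x[t - 1] + sigma_x * rng.normal()
    y = x + sigma_y * rng.normal(size=n)
    return x, y

def kalman_loglik(y, phi, sigma_x, sigma_y):
    """Exact log-likelihood of the observations y under the model above,
    computed by the standard Kalman filter recursion."""
    m = 0.0                                   # predictive mean of X_t
    P = sigma_x**2 / (1.0 - phi**2)           # predictive variance (stationary init)
    ll = 0.0
    for yt in y:
        S = P + sigma_y**2                    # innovation variance
        v = yt - m                            # innovation
        ll += -0.5 * (np.log(2.0 * np.pi * S) + v**2 / S)
        K = P / S                             # Kalman gain
        m, P = m + K * v, (1.0 - K) * P       # filtering update
        m, P = phi * m, phi**2 * P + sigma_x**2  # one-step prediction
    return ll
```

For example, on a long simulated series the average log-likelihood is larger at the true parameter than at a misspecified one, which is the kind of likelihood separation underlying the consistency arguments:

```python
rng = np.random.default_rng(0)
_, y = simulate_lgssm(phi=0.7, sigma_x=1.0, sigma_y=0.5, n=2000, rng=rng)
kalman_loglik(y, 0.7, 1.0, 0.5)  # higher than kalman_loglik(y, 0.2, 1.0, 0.5)
```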