Nicolas Chopin and I have just resubmitted and re-arXived a (much) revised version of our manuscript on Expectation Propagation for Likelihood-Free Inference. Likelihood-free inference is what you end up doing when you have a stochastic model of some phenomenon, some data on that phenomenon, and no way to compute the probability of getting the data under the model. That actually happens quite a bit and can be a major pain in the neck. The solution is to use simulations from the model rather than a likelihood function, hence the inference is “likelihood-free”. In the Bayesian world these methods are usually called something-ABC or ABC-something, where ABC stands for “Approximate Bayesian Computation”. The original ABC algorithm appears, in somewhat hidden form, in Tavaré et al. (1997), but since then dozens and dozens of new variants of ABC have appeared.
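
For readers who have never met ABC: the basic rejection version fits in a few lines. Here is a minimal Python sketch on a toy Gaussian model; the simulator, summary statistic and tolerance below are purely illustrative choices, not anything taken from the paper or from Tavaré et al.

```python
# Toy rejection ABC: data y ~ N(theta, 1), prior theta ~ N(0, 10).
# All numbers (tolerance, number of draws) are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def simulator(theta, rng, n):
    """We can simulate data given theta, but pretend we cannot evaluate its likelihood."""
    return theta + rng.standard_normal(n)

y_obs = simulator(0.5, rng, 50)        # "observed" data
s_obs = y_obs.mean()                   # summary statistic

eps, n_draws = 0.05, 100_000           # tolerance and number of prior draws
accepted = []
for _ in range(n_draws):
    theta = rng.normal(0.0, np.sqrt(10.0))             # draw theta from the prior
    s_sim = simulator(theta, rng, y_obs.size).mean()   # summarise the simulated data
    if abs(s_sim - s_obs) <= eps:                      # keep theta if the simulation is close
        accepted.append(theta)

accepted = np.array(accepted)
print(f"{accepted.size} draws accepted, posterior mean roughly {accepted.mean():.3f}")
```

The accepted draws form an approximate posterior sample, but you pay for every acceptance with a full-dataset simulation, which is exactly why these methods tend to be slow.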

Most of these are dog-slow. We show in the article that if the model has a certain factorising form (roughly, the likelihood breaks into a product of terms that each depend on a small chunk of the data), then using the Expectation Propagation algorithm (Minka, 2001) we can do likelihood-free inference a lot faster.
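
To make that less abstract, here is a rough Python sketch of the kind of site update involved, stripped down to a one-dimensional parameter, a Gaussian approximation in natural parameters, and one site per observation of the same toy model as above. It is meant to convey the flavour only, not to reproduce the actual algorithm or the MATLAB code; the tolerance, number of simulations and acceptance threshold are made-up values.

```python
# Schematic EP-ABC-style updates for a toy model: Gaussian posterior
# approximation in natural parameters (precision Q, shift r), one site per
# observation.  Tolerances and sample sizes are illustrative.
import numpy as np

rng = np.random.default_rng(2)

def simulator(theta, rng, size):
    """Simulate one pseudo-observation per parameter value theta."""
    return theta + rng.standard_normal(size)

theta_true, n = 0.5, 20
y = simulator(theta_true, rng, n)      # observed data, one EP site per y[i]

Q, r = 1.0 / 10.0, 0.0                 # global natural params, start at the N(0, 10) prior
Q_site = np.zeros(n)                   # per-site contributions (all flat initially)
r_site = np.zeros(n)

eps, M = 0.3, 2000                     # ABC tolerance, simulations per site update

for sweep in range(3):                 # a few passes over the sites
    for i in range(n):
        # Cavity: the approximation with site i removed.
        Q_cav, r_cav = Q - Q_site[i], r - r_site[i]
        if Q_cav <= 0:
            continue                   # skip ill-defined cavities
        m_cav, v_cav = r_cav / Q_cav, 1.0 / Q_cav

        # Draw parameters from the cavity, simulate pseudo-data for this chunk,
        # keep the draws whose simulation lands within eps of the observed y[i].
        thetas = m_cav + np.sqrt(v_cav) * rng.standard_normal(M)
        pseudo = simulator(thetas, rng, M)
        acc = thetas[np.abs(pseudo - y[i]) <= eps]
        if acc.size < 30:
            continue                   # too few acceptances to trust the moments

        # Moment-match the tilted distribution, then back out the new site.
        Q_new = 1.0 / acc.var()
        r_new = acc.mean() * Q_new
        Q_site[i], r_site[i] = Q_new - Q_cav, r_new - r_cav
        Q, r = Q_new, r_new

print(f"posterior mean ~ {r / Q:.3f}, posterior sd ~ {(1.0 / Q) ** 0.5:.3f}")
```

The point of the factorising form is that each site update only has to match a small chunk of the data by simulation, rather than the whole dataset at once, which is where the speed comes from.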

New in this version:

• We show how to use composite likelihood to extend the class of models compatible with EP-ABC to just about any model (there is a short illustration after this list).

• We have a new numerical example: an alpha-stable stochastic volatility model.

• We discuss how the algorithm behaves when faced with difficult posterior distributions, and what sort of work-arounds there are.
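
On the composite likelihood point from the first bullet: the trick is to replace the exact likelihood, which may not factorise at all, with a product of low-dimensional pieces, which is exactly the form EP-ABC can digest. One standard example, given here purely for illustration (the manuscript spells out which decompositions we actually use), is the pairwise composite likelihood over consecutive observations:

```latex
% Pairwise composite likelihood over consecutive observations: a product of
% low-dimensional factors, hence directly in the factorised form EP-ABC needs.
p_{\mathrm{CL}}(y_{1:n} \mid \theta) \;=\; \prod_{i=1}^{n-1} p(y_i, y_{i+1} \mid \theta)
```

Each factor involves only a pair of observations, so it can be matched by simulation just like a genuine likelihood factor.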

The MATLAB software package will be updated soon as well.
