Statistics Seminars: Informed sub-sampling MCMC: Approximate Bayesian inference for large datasets
28 January 2019 12:00 in CM221
This talk introduces a framework for speeding up Bayesian inference for large datasets. We
design a Markov chain whose transition kernel uses a fraction (of fixed size) of
the available data that is randomly refreshed throughout the algorithm. Inspired by the
Approximate Bayesian Computation (ABC) literature, the subsampling process is guided by
the fidelity to the observed data, as measured by summary statistics. The resulting
algorithm, Informed Sub-Sampling MCMC (ISS-MCMC), is a generic and flexible approach
which, contrary to existing scalable methodologies, preserves the simplicity of the
Metropolis-Hastings algorithm. Although exactness is lost, i.e. the stationary
distribution of the chain only approximates the posterior, we theoretically study and
quantify this bias, and show on a diverse set of examples that the method performs
excellently when the computational budget is limited.
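As a rough illustration of the idea described above, the following is a minimal toy sketch, not the authors' implementation: a Metropolis-Hastings sampler for the mean of Gaussian data that evaluates the (rescaled) likelihood on a fixed-size subsample, where the subsample is periodically refreshed with an ABC-style acceptance weight favouring subsamples whose summary statistic (here, the sample mean) is close to that of the full data. All tuning constants (`eps`, step sizes, subsample size) are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N observations from N(mu=2, 1); we infer mu under a N(0, 10^2) prior.
N, n = 10_000, 100            # full data size, fixed subsample size
y = rng.normal(2.0, 1.0, N)
s_full = y.mean()             # summary statistic of the full data

def fidelity(idx, eps=50.0):
    """ABC-style weight: large when the subsample's summary matches the full data."""
    return np.exp(-eps * (y[idx].mean() - s_full) ** 2)

def log_post(mu, idx):
    """Subsampled log-posterior, with the log-likelihood rescaled by N/n."""
    return -mu**2 / (2 * 10.0**2) + (N / n) * (-0.5 * np.sum((y[idx] - mu) ** 2))

mu, idx = 0.0, rng.choice(N, n, replace=False)
samples = []
for t in range(5_000):
    # (1) Refresh the subsample: swap a few indices, accept via the fidelity ratio
    #     (duplicate indices are possible; acceptable for a sketch).
    new_idx = idx.copy()
    new_idx[rng.integers(n, size=5)] = rng.integers(N, size=5)
    if rng.random() < fidelity(new_idx) / fidelity(idx):
        idx = new_idx
    # (2) Standard Metropolis-Hastings random-walk step for mu on the subsample.
    prop = mu + 0.05 * rng.normal()
    if np.log(rng.random()) < log_post(prop, idx) - log_post(mu, idx):
        mu = prop
    samples.append(mu)

print(np.mean(samples[2000:]))  # posterior mean estimate, near the true mean 2.0
```

The key point the sketch tries to convey is that each MCMC iteration touches only n of the N data points, while the fidelity-guided refreshment keeps the subsample informative about the full dataset.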
This is joint work with Florian Maire (Montreal) and Pierre Alquier (ENSAE, Paris).
Maire, F., Friel, N. & Alquier, P. (2018). Informed sub-sampling MCMC: approximate Bayesian inference for large datasets. Statistics and Computing. https://doi.org/10.1007/s11222-018-9817-3
Contact email@example.com for more information