Stats4Grads: Hamiltonian Monte Carlo and its variants
21 January 2015 13:00 in CM105
Hamiltonian Monte Carlo (HMC), also known as Hybrid Monte Carlo, is a Markov chain Monte Carlo (MCMC) method: one of a family of sampling strategies that generate a sequence of correlated samples converging to a desired target distribution. In many settings, especially Bayesian statistics, target distributions have complicated forms, highly correlated parameters and high dimension. Traditional MCMC methods, such as random-walk Metropolis–Hastings and Gibbs sampling, can explore the state space slowly and suffer low acceptance rates, caused both by their random-walk behaviour and by the complex nature of the target distribution. HMC avoids these problems by proposing states reached after several steps guided by the gradient of the target distribution. This yields distant proposals and faster convergence than traditional random-walk methods. Although HMC's demonstrated ability to suppress random-walk behaviour suggests it should be a highly successful tool for Bayesian inference, its performance depends on its algorithm parameters, such as the step size and number of steps. The talk will discuss three HMC variants that tune these parameters automatically.
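The gradient-guided proposal mechanism described above can be sketched as follows. This is a minimal illustrative implementation (not from the talk), assuming a target given by its log-density and gradient; it uses the standard leapfrog integrator followed by a Metropolis accept/reject step to correct for discretisation error:

```python
import numpy as np

def hmc_step(x, log_prob, log_prob_grad, step_size=0.1, n_leapfrog=20, rng=None):
    """One HMC transition for a target with log-density `log_prob`
    and gradient `log_prob_grad` (names are illustrative)."""
    rng = rng or np.random.default_rng()
    p = rng.standard_normal(x.shape)            # sample auxiliary momentum
    x_new, p_new = x.copy(), p.copy()
    # Leapfrog integration: several steps along the gradient of the target,
    # which is what lets HMC make distant, informed proposals.
    p_new += 0.5 * step_size * log_prob_grad(x_new)   # initial half step
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new += step_size * log_prob_grad(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * log_prob_grad(x_new)   # final half step
    # Metropolis correction using the Hamiltonian (potential + kinetic energy)
    h_old = -log_prob(x) + 0.5 * p @ p
    h_new = -log_prob(x_new) + 0.5 * p_new @ p_new
    if rng.random() < np.exp(h_old - h_new):
        return x_new                            # accept the proposal
    return x                                    # reject: stay at current state
```

The two tuning parameters whose choice the talk's variants address are visible here as `step_size` and `n_leapfrog`: too large a step size leads to rejections, while too few steps reintroduces random-walk behaviour.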
Contact firstname.lastname@example.org for more information
See the Stats4Grads page for more details about this series.