This impromptu talk was presented to introduce the basics of the Markov chain Monte Carlo (MCMC) technique, which is being increasingly used in Bayesian analysis. The aim of MCMC is to produce a sequence of parameter vectors that represent random draws from a probability density function (pdf); in Bayesian analysis the pdf of interest is typically the posterior distribution. Because of its simplicity, the most often used MCMC technique is the Metropolis algorithm. At each point in the sequence, it takes a random step away from the present point to a new trial location, which is accepted or rejected on the basis of its probability relative to that of the previous point. Issues of burn-in and convergence are introduced. The efficiency of the MCMC technique is assessed by estimating the autocorrelation of each parameter. More advanced algorithms include the Metropolis-Hastings, Langevin, and Hamiltonian-hybrid algorithms.
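The accept/reject step described above can be sketched in a few lines of Python. This is a minimal illustration, not the code from the talk: the target distribution (a standard normal), the step size, and the burn-in length are all arbitrary choices made here for the example.

```python
import math
import random

def metropolis(log_pdf, x0, step_size, n_samples, seed=0):
    """Metropolis sampler: random-walk proposals, accepted or rejected
    according to the probability ratio of the trial and current points."""
    rng = random.Random(seed)
    x = x0
    chain = []
    for _ in range(n_samples):
        # Take a random step away from the present point to a trial location.
        x_trial = x + rng.uniform(-step_size, step_size)
        # Accept with probability min(1, p(trial)/p(current));
        # working with log-pdfs avoids underflow and lets us drop
        # any normalizing constant.
        if math.log(rng.random()) < log_pdf(x_trial) - log_pdf(x):
            x = x_trial
        chain.append(x)
    return chain

# Example target: a standard normal, known only up to a constant.
log_normal = lambda x: -0.5 * x * x

chain = metropolis(log_normal, x0=0.0, step_size=1.0, n_samples=50000)
burn_in = 1000  # discard early samples taken before the chain converges
samples = chain[burn_in:]
mean = sum(samples) / len(samples)
```

After burn-in, the retained samples should have mean near 0 and variance near 1; the residual autocorrelation of the chain is what determines how many effectively independent draws these samples represent.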
Keywords: Markov chain Monte Carlo, Metropolis algorithm, Metropolis-Hastings, Hamiltonian-hybrid technique, MCMC convergence, burn-in, MCMC efficiency, autocorrelation
Viewgraphs for this talk (pdf, 187 KB)