by Peter Green (Bristol, UK) and Duncan Murdoch (Queen's University, Canada)


There are now methods for organising a Markov chain Monte Carlo simulation so that it can be guaranteed that the state of the process at a given time is an exact draw from the stationary distribution of the chain.

With the ultimate objective of Bayesian MCMC with guaranteed
convergence, the purpose of this paper is to describe recent efforts to
construct exact sampling methods for continuous-state Markov chains. We
review existing methods based on gamma-coupling and rejection sampling
(Murdoch and Green, *Scandinavian Journal of Statistics*, 1998),
which are quite straightforward to understand, but which require a closed
form for the transition kernel and entail cumbersome algebraic manipulation.
We then introduce two new methods based on random walk Metropolis, which
offer the prospect of more automatic use, not least because the
difficult, continuous part of the transition mechanism can be coupled
in a generic way, using a proposal distribution of convenience.
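As background, the guarantee of exactness is easiest to see in the finite-state case. The sketch below is a minimal version of Propp and Wilson's coupling from the past, the exact-sampling construction underlying this line of work; the chain and function names here are illustrative, not taken from the paper:

```python
import random

def step(row, u):
    """Inverse-CDF transition: map a shared uniform u to the next state."""
    c = 0.0
    for j, p in enumerate(row):
        c += p
        if u < c:
            return j
    return len(row) - 1

def cftp(P, seed=0):
    """Coupling from the past for a finite-state chain with transition
    matrix P: returns an exact draw from the stationary distribution."""
    n = len(P)
    rng = random.Random(seed)
    U = []   # U[t-1] drives the step from time -t to -t+1; reused across sweeps
    T = 1
    while True:
        while len(U) < T:
            U.append(rng.random())
        # Start a copy of the chain in every state at time -T, and drive
        # all copies with the same shared uniforms.
        states = list(range(n))
        for t in range(T, 0, -1):
            u = U[t - 1]
            states = [step(P[s], u) for s in states]
        if len(set(states)) == 1:
            return states[0]   # all copies coalesced: the time-0 state is exact
        T *= 2                 # not coalesced: restart from further in the past
```

The essential point is that the randomness nearest time 0 is fixed and reused as the start time is pushed further back; only then is the output an exact stationary draw rather than an approximation.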

One of the methods is based on a neat decomposition of any unimodal (multivariate) symmetric density into pieces that may be re-assembled to construct any translated copy of itself: this allows coupling of a continuum of Metropolis proposals to a finite set, at least for a compact state space. We discuss methods for economically coupling the subsequent accept/reject decisions.
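To see why such a decomposition is needed at all, consider the naive alternative of simply sharing randomness between chains. The sketch below (an illustration, not the paper's construction) runs two random walk Metropolis chains with a common proposal increment and a common acceptance uniform: the chains can drift arbitrarily close, but in a continuous state space they never land on exactly the same point, so no coupling into a finite set occurs.

```python
import math
import random

def crn_rwm_gap(x, y, n_steps, seed=1):
    """Two random walk Metropolis chains targeting N(0,1), driven by common
    random numbers: one shared Gaussian increment and one shared acceptance
    uniform per step. Returns the gap |x - y| after each step."""
    rng = random.Random(seed)
    logpi = lambda s: -0.5 * s * s        # log N(0,1) density, up to a constant
    gaps = []
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)           # shared proposal increment
        lu = math.log(rng.random())       # shared (log) acceptance uniform
        if lu < logpi(x + z) - logpi(x):  # accept/reject for chain 1
            x = x + z
        if lu < logpi(y + z) - logpi(y):  # accept/reject for chain 2
            y = y + z
        gaps.append(abs(x - y))
    return gaps
```

When both chains accept they move in lockstep and the gap is unchanged; the gap changes only when the accept/reject decisions disagree, and then it moves to a new, almost surely nonzero, value. Exact coalescence therefore has probability zero, which is the obstacle the density decomposition is designed to remove.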

Our second new method deals with unbounded state spaces, using a trick due to W. S. Kendall of running a coupled dominating process in parallel with the sample paths of interest. The random subset of the state space below the dominating path is compact, allowing efficient coupling and coalescence.
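The dominating-process idea can be illustrated on a toy example: a reversible birth-death chain on the non-negative integers, whose stationary distribution is geometric. Everything below (the chain, its rates, the monotone coupling) is an illustrative assumption chosen to make the trick visible in a few lines; it is not the algorithm of the paper. A stationary dominating path is simulated backwards in time using reversibility, the forward innovations consistent with that path are reconstructed, and a lower chain started from 0 is sandwiched against the dominating path until coalescence.

```python
import random

P_UP, P_DOWN = 0.3, 0.4   # birth/death probabilities; stationary law is
                          # geometric with ratio r = P_UP / P_DOWN

def forward(x, u):
    """Monotone update driven by a single uniform u."""
    if u < P_UP:
        return x + 1
    if u > 1.0 - P_DOWN:
        return max(x - 1, 0)
    return x

def dominated_cftp(seed=0):
    """Kendall-style dominated coupling from the past for the toy chain:
    returns an exact draw from its geometric stationary distribution."""
    rng = random.Random(seed)
    r = P_UP / P_DOWN
    d0 = 0                       # D_0 ~ stationary geometric: P(k) = (1-r) r^k
    while rng.random() < r:
        d0 += 1
    D = [d0]   # D[t] is the dominating path at time -t
    U = []     # U[t] drives the forward step from time -(t+1) to -t
    T = 1
    while True:
        while len(D) <= T:
            # Extend D backwards: the chain is reversible, so its
            # time-reversal from stationarity has the same dynamics.
            D.append(forward(D[-1], rng.random()))
        while len(U) < T:
            # Reconstruct the forward innovation consistent with the
            # observed dominating transition D[-(t+1)] -> D[-t].
            t = len(U)
            a, b = D[t + 1], D[t]
            if b == a + 1:
                u = rng.uniform(0.0, P_UP)
            elif b == a - 1:
                u = rng.uniform(1.0 - P_DOWN, 1.0)
            elif a == 0:
                u = rng.uniform(P_UP, 1.0)   # from 0, "stay" and "down" coincide
            else:
                u = rng.uniform(P_UP, 1.0 - P_DOWN)
            U.append(u)
        # Run the lowest chain from 0 at time -T; the chain started at D[T]
        # reproduces the dominating path itself, so by monotonicity every
        # start in the random compact set [0, D[T]] is sandwiched between them.
        low = 0
        for t in range(T, 0, -1):
            low = forward(low, U[t - 1])
        if low == D[0]:
            return D[0]          # sandwich closed: exact stationary draw
        T *= 2
```

The role of the dominating path is exactly as in the abstract: at each restart time the region below it is a compact (here, finite) set, so the coupling only ever has to track its lower and upper boundaries.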

We look towards the possibility that such methods could become sufficiently convenient to serve as the basis for routine Bayesian computation in the foreseeable future.
