next up previous contents
Next: Error Analysis for MCMC Up: Simulation Methods Based on Previous: Gibbs Sampler   Contents


Metropolis-Hastings Algorithm


Remarks
 


Theorem 3.14   $ \;$ The transition matrix $ {\mathbf{P}}=(p_{{\mathbf{x}}{\mathbf{x}}^\prime})$ of the Metropolis-Hastings algorithm constructed above is irreducible and aperiodic, and the pair $ ({\mathbf{P}},{\boldsymbol{\pi}})$ is reversible.

Proof
 


Examples
 
  1. Metropolis Algorithm 
    • The classic Metropolis algorithm is obtained if we consider equality in (50), i.e. if

      $\displaystyle s_{{\mathbf{x}}{\mathbf{x}}^\prime}= 1+\min\,\bigl\{t_{{\mathbf{x}}{\mathbf{x}}^\prime}\,,\,t_{{\mathbf{x}}^\prime{\mathbf{x}}}\bigr\}\,,\qquad\forall\,{\mathbf{x}},\,{\mathbf{x}}^\prime\in E\,.
$

    • In this case the acceptance probabilities $ a_{{\mathbf{x}}{\mathbf{x}}^\prime}$ for arbitrary $ {\mathbf{x}},\,{\mathbf{x}}^\prime\in E$ such that $ q_{{\mathbf{x}}{\mathbf{x}}^\prime}>0$ are of the following form:
      $\displaystyle a_{{\mathbf{x}}{\mathbf{x}}^\prime}$ $\displaystyle =$ $\displaystyle \frac{1+\min\,\bigl\{t_{{\mathbf{x}}{\mathbf{x}}^\prime}\,,t_{{\mathbf{x}}^\prime{\mathbf{x}}}\bigr\}}{1+t_{{\mathbf{x}}^\prime{\mathbf{x}}}}$  
        $\displaystyle =$ $\displaystyle \min\Biggl\{\frac{1+t_{{\mathbf{x}}{\mathbf{x}}^\prime}}{1+t_{{\mathbf{x}}^\prime{\mathbf{x}}}}\,,\;\frac{1+t_{{\mathbf{x}}^\prime{\mathbf{x}}}}{1+t_{{\mathbf{x}}^\prime{\mathbf{x}}}}\Biggr\}$  
        $\displaystyle =$ $\displaystyle \min\Biggl\{1,\;\frac{\pi_{{\mathbf{x}}^\prime}q_{{\mathbf{x}}^\prime{\mathbf{x}}}}{\pi_{\mathbf{x}}q_{{\mathbf{x}}{\mathbf{x}}^\prime}}\Biggr\}\,,$  

      i.e.

      $\displaystyle a_{{\mathbf{x}}{\mathbf{x}}^\prime}=\min\Biggl\{1,\;\frac{\pi_{{\mathbf{x}}^\prime}q_{{\mathbf{x}}^\prime{\mathbf{x}}}}{\pi_{\mathbf{x}}q_{{\mathbf{x}}{\mathbf{x}}^\prime}}\Biggr\}\,,\qquad\forall\,{\mathbf{x}},\,{\mathbf{x}}^\prime\in E\;$$\displaystyle \mbox{such that $q_{{\mathbf{x}}{\mathbf{x}}^\prime}>0$.}$ (52)

    • If the matrix $ {\mathbf{Q}}=(q_{{\mathbf{x}}{\mathbf{x}}^\prime})$ of the "potential" transition probabilities is symmetric, then (52) implies

      $\displaystyle a_{{\mathbf{x}}{\mathbf{x}}^\prime}=\min\Bigl\{1,\;\frac{\pi_{{\mathbf{x}}^\prime}}{\pi_{\mathbf{x}}}\Bigr\}\,,\qquad\forall\,{\mathbf{x}},\,{\mathbf{x}}^\prime\in E\;$$\displaystyle \mbox{such that $q_{{\mathbf{x}}{\mathbf{x}}^\prime}>0$.}$ (53)

    • In particular, if the "potential" updates $ {\mathbf{x}}\longrightarrow{\mathbf{x}}^\prime$ are chosen "randomly", i.e. if

      $\displaystyle q_{{\mathbf{x}}{\mathbf{x}}^\prime}=\frac{1}{\vert E\vert}\;,\qquad\forall\,{\mathbf{x}},\,{\mathbf{x}}^\prime\in E\,,
$

      then the acceptance probabilities $ a_{{\mathbf{x}}{\mathbf{x}}^\prime}$ are also given by (53).
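This "random update" case can be sketched in a few lines of Python. The sketch below is an illustration under our own naming (not part of the notes): `pi` is an unnormalized target weight vector on the finite state space $E=\{0,\dots,|E|-1\}$, the proposal is uniform, so $q_{{\mathbf{x}}{\mathbf{x}}^\prime}=1/|E|$ is symmetric and the acceptance rule (53) applies. Only ratios $\pi_{{\mathbf{x}}^\prime}/\pi_{\mathbf{x}}$ enter, so `pi` need not sum to one.

```python
import random

def metropolis_step(x, pi, rng=random):
    """One Metropolis transition from state x, uniform proposal q_xx' = 1/|E|."""
    n = len(pi)
    y = rng.randrange(n)                 # propose x' uniformly at random
    a = min(1.0, pi[y] / pi[x])          # acceptance probability (53)
    return y if rng.random() < a else x  # accept x', otherwise stay at x

def metropolis_chain(x0, pi, steps, seed=0):
    """Run the chain for a number of steps and return the final state."""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        x = metropolis_step(x, pi, rng)
    return x
```

Running the chain long enough, the empirical state frequencies approach the target distribution proportional to `pi`; rejected proposals (the `else x` branch) are an essential part of the transition and must be counted as repeated visits.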


  2. Barker Algorithm 
    • The so-called Barker algorithm is obtained if we consider the matrix $ {\mathbf{S}}=\bigl(s_{{\mathbf{x}}{\mathbf{x}}^\prime}\bigr)$ where $ s_{{\mathbf{x}}{\mathbf{x}}^\prime}=1$ for arbitrary $ {\mathbf{x}},\,{\mathbf{x}}^\prime\in E$.
    • The acceptance probabilities $ a_{{\mathbf{x}}{\mathbf{x}}^\prime}$ are then given by

      $\displaystyle a_{{\mathbf{x}}{\mathbf{x}}^\prime}= \frac{\pi_{{\mathbf{x}}^\prime}q_{{\mathbf{x}}^\prime{\mathbf{x}}}}{\pi_{\mathbf{x}}q_{{\mathbf{x}}{\mathbf{x}}^\prime}+\pi_{{\mathbf{x}}^\prime}q_{{\mathbf{x}}^\prime{\mathbf{x}}}}\;,\qquad\forall\,{\mathbf{x}},\,{\mathbf{x}}^\prime\in E\;$$\displaystyle \mbox{such that $q_{{\mathbf{x}}{\mathbf{x}}^\prime}>0$.}$ (54)

    • If the matrix $ {\mathbf{Q}}=(q_{{\mathbf{x}}{\mathbf{x}}^\prime})$ of "potential" transition probabilities is symmetric, then

      $\displaystyle a_{{\mathbf{x}}{\mathbf{x}}^\prime}= \frac{\pi_{{\mathbf{x}}^\prime}}{\pi_{{\mathbf{x}}^\prime} + \pi_{\mathbf{x}}}\;,\qquad\forall\,{\mathbf{x}},\,{\mathbf{x}}^\prime\in E\;$$\displaystyle \mbox{such that $q_{{\mathbf{x}}{\mathbf{x}}^\prime}>0$.}$ (55)
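The two acceptance rules, (52) for Metropolis-Hastings and (54) for Barker, can be placed side by side in a small Python sketch for a general, not necessarily symmetric, proposal matrix. The names are ours and purely illustrative: `pi_x`, `pi_y` are the target weights of $\mathbf{x}$ and $\mathbf{x}^\prime$, and `q_xy`, `q_yx` are the proposal probabilities of the move and its reversal.

```python
def accept_metropolis(pi_x, pi_y, q_xy, q_yx):
    """Metropolis-Hastings acceptance (52): min{1, pi_x' q_x'x / (pi_x q_xx')}."""
    return min(1.0, (pi_y * q_yx) / (pi_x * q_xy))

def accept_barker(pi_x, pi_y, q_xy, q_yx):
    """Barker acceptance (54): pi_x' q_x'x / (pi_x q_xx' + pi_x' q_x'x)."""
    return (pi_y * q_yx) / (pi_x * q_xy + pi_y * q_yx)
```

Since $r/(1+r)\le\min\{1,r\}$ for every $r\ge 0$, the Barker probability never exceeds the Metropolis-Hastings probability; the Barker chain rejects more often for the same target and proposal.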


MCMC Simulation Algorithm
 



Ursa Pantle 2006-07-20