
### Gibbs Sampler

The MCMC algorithm for the generation of "randomly picked" admissible configurations of the hard-core model (see Section 3.3.1) is a special case of a so-called Gibbs sampler for the simulation of discrete (high dimensional) random vectors.

• Let $V$ be a finite (nonempty) index set and let $X = (X_v,\, v \in V)$ be a discrete random vector
• taking values in the finite state space $E$ with probability $1$, where we assume
• that for every pair $\mathbf{x}, \mathbf{x}' \in E$ there is a finite sequence of states $\mathbf{x}_0, \mathbf{x}_1, \ldots, \mathbf{x}_k \in E$ such that

 $\mathbf{x}_0 = \mathbf{x}$, $\mathbf{x}_k = \mathbf{x}'$ and $\#\{v \in V :\, x_{j-1,v} \neq x_{j,v}\} \le 1$ for all $j = 1, \ldots, k$. (34)

• Let $\pi = (\pi(\mathbf{x}),\, \mathbf{x} \in E)$ be the probability function of the random vector $X$ with $\pi(\mathbf{x}) > 0$ for all $\mathbf{x} \in E$, and for all $v \in V$ and $\mathbf{x} = (x_w,\, w \in V) \in E$ let

 $\pi_v(x_v \mid \mathbf{x}_{-v}) = \dfrac{\pi(\mathbf{x})}{\sum_{\mathbf{y} \in E:\, \mathbf{y}_{-v} = \mathbf{x}_{-v}} \pi(\mathbf{y})}$ (35)

• denote the conditional probability that the $v$-th component of $X$ has the value $x_v$,
• given that the vector $\mathbf{x}_{-v} = (x_w,\, w \in V \setminus \{v\})$ of the other components is observed, where we assume $P(X_w = x_w \text{ for all } w \neq v) > 0$.

MCMC Simulation Algorithm

• Similar to Section 3.3.1 we construct a Markov chain $X_0, X_1, \ldots$
• with state space $E$ and an (appropriately chosen) irreducible and aperiodic transition matrix $\mathbf{P} = (p(\mathbf{x}, \mathbf{x}'))_{\mathbf{x}, \mathbf{x}' \in E}$,
• such that $\pi$ is the ergodic limit distribution of $X_0, X_1, \ldots$.
• Then we generate a "path" $\mathbf{x}_0, \mathbf{x}_1, \ldots$ of the Markov chain by the recursive construction discussed in Section 2.1.3:

1. Pick an initial state $\mathbf{x}_0 \in E$.
2. Given the current state $\mathbf{x}_n = (x_{n,w},\, w \in V)$, pick a component $v \in V$ according to a given probability function $q = (q_v,\, v \in V)$ such that $q_v > 0$ for all $v \in V$.
3. Generate the update $x_{n+1,v}$ of the $v$-th component according to the (conditional) probability function $\pi_v(\cdot \mid \mathbf{x}_{n,-v})$.
4. The values of all other components are not changed, i.e. $x_{n+1,w} = x_{n,w}$ for all $w \neq v$.
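The four steps above can be sketched in a few lines of Python. The target distribution `pi` on $E = \{0,1\}^2$, the selection probabilities `q`, and all function names below are illustrative choices for a toy example, not taken from the text; since all probabilities are positive, condition (34) holds automatically.

```python
import random
from collections import Counter

# Hypothetical target distribution pi on E = {0,1}^2 (all values positive)
pi = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
V = [0, 1]        # index set
q = [0.5, 0.5]    # selection probabilities q_v > 0 (step 2)

def conditional(v, x):
    """pi_v( . | x_{-v}) from (35): condition pi on all components except v."""
    compatible = [y for y in pi if all(y[w] == x[w] for w in V if w != v)]
    denom = sum(pi[y] for y in compatible)
    return {y[v]: pi[y] / denom for y in compatible}

def gibbs_step(x, rng):
    v = rng.choices(V, weights=q)[0]          # step 2: pick a component
    dist = conditional(v, x)                  # step 3: conditional update
    values = list(dist)
    new_value = rng.choices(values, weights=[dist[u] for u in values])[0]
    y = list(x)
    y[v] = new_value                          # step 4: other components unchanged
    return tuple(y)

rng = random.Random(1)
x = (0, 0)                                    # step 1: initial state
counts = Counter()
for _ in range(50000):
    x = gibbs_step(x, rng)
    counts[x] += 1
freq = {s: counts[s] / 50000 for s in pi}     # empirical frequencies approach pi
```

After many steps the empirical frequencies of the visited states approximate $\pi$, which is the ergodicity claim made precise in Theorem 3.11 below.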

Theorem 3.11   Let the transition matrix $\mathbf{P} = (p(\mathbf{x}, \mathbf{x}'))_{\mathbf{x}, \mathbf{x}' \in E}$ be given as

 $p(\mathbf{x}, \mathbf{x}') = \sum_{v \in V} q_v\, \pi_v(x'_v \mid \mathbf{x}_{-v})\, \mathbb{1}\bigl(x'_w = x_w \text{ for all } w \neq v\bigr),$ (36)

where the conditional probabilities $\pi_v(\cdot \mid \cdot)$ are defined in (35). Then $\mathbf{P}$ is irreducible and aperiodic and the pair $(\pi, \mathbf{P})$ is reversible.

Proof
The assertion can be proved similarly to the proof of Theorem 3.10.
• In order to see that $\mathbf{P}$ is aperiodic it suffices to notice
• that for all $\mathbf{x} \in E$

 $p(\mathbf{x}, \mathbf{x}) = \sum_{v \in V} q_v\, \pi_v(x_v \mid \mathbf{x}_{-v}) > 0,$

• and hence all diagonal elements of $\mathbf{P}$ are positive.
• The following considerations show that $\mathbf{P}$ is irreducible.
• For arbitrary but fixed $\mathbf{x}, \mathbf{x}' \in E$ let $k$ be the number of components $v \in V$ such that $x_v \neq x'_v$.
• For $k = 0$, i.e. $\mathbf{x} = \mathbf{x}'$, we already showed while proving the aperiodicity that $p(\mathbf{x}, \mathbf{x}') > 0$.
• Let now $k \ge 1$. Without loss of generality we may assume that the components are linearly ordered and that the first $k$ components of $\mathbf{x}$ and $\mathbf{x}'$ differ.
• By hypothesis (34) the state space $E$ contains a sequence $\mathbf{x}_0, \mathbf{x}_1, \ldots, \mathbf{x}_m$ such that $\mathbf{x}_0 = \mathbf{x}$, $\mathbf{x}_m = \mathbf{x}'$ and

 $\#\{v \in V :\, x_{j-1,v} \neq x_{j,v}\} \le 1$ for all $j = 1, \ldots, m$.

• Moreover, if $\mathbf{x}_{j-1}$ and $\mathbf{x}_j$ differ in exactly one component $v_j$, then (35) and (36) imply

 $p(\mathbf{x}_{j-1}, \mathbf{x}_j) = q_{v_j}\, \pi_{v_j}\bigl(x_{j,v_j} \mid (\mathbf{x}_{j-1})_{-v_j}\bigr) = q_{v_j}\, \dfrac{\pi(\mathbf{x}_j)}{\sum_{\mathbf{y} \in E:\, \mathbf{y}_{-v_j} = (\mathbf{x}_j)_{-v_j}} \pi(\mathbf{y})} > 0,$ (37)

whereas $p(\mathbf{x}_{j-1}, \mathbf{x}_j) > 0$ holds anyway if $\mathbf{x}_{j-1} = \mathbf{x}_j$, and thus $p^{(m)}(\mathbf{x}, \mathbf{x}') \ge p(\mathbf{x}_0, \mathbf{x}_1) \cdots p(\mathbf{x}_{m-1}, \mathbf{x}_m) > 0$.
• It is left to show that the detailed balance equation (2.85) holds, i.e.

 $\pi(\mathbf{x})\, p(\mathbf{x}, \mathbf{x}') = \pi(\mathbf{x}')\, p(\mathbf{x}', \mathbf{x})$ for all $\mathbf{x}, \mathbf{x}' \in E$. (38)

• If $\mathbf{x} = \mathbf{x}'$, then (38) obviously holds.
• If $x_v \neq x'_v$ for more than one component $v \in V$, then $p(\mathbf{x}, \mathbf{x}') = p(\mathbf{x}', \mathbf{x}) = 0$ and hence (38) holds.
• Let now $x_v \neq x'_v$ for exactly one $v \in V$ (and hence $x_w = x'_w$ for all $w \neq v$). Then (37) implies

 $\pi(\mathbf{x})\, p(\mathbf{x}, \mathbf{x}') = q_v\, \dfrac{\pi(\mathbf{x})\, \pi(\mathbf{x}')}{\sum_{\mathbf{y} \in E:\, \mathbf{y}_{-v} = \mathbf{x}_{-v}} \pi(\mathbf{y})} = \pi(\mathbf{x}')\, p(\mathbf{x}', \mathbf{x}),$

since $\mathbf{x}_{-v} = \mathbf{x}'_{-v}$ and the middle expression is symmetric in $\mathbf{x}$ and $\mathbf{x}'$.
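The claims of Theorem 3.11 can also be checked numerically. The sketch below (the distribution `pi` and all names are an illustrative toy example, not from the text) builds the full matrix (36) on $E = \{0,1\}^2$ and verifies that the rows sum to one, that the diagonal is positive (aperiodicity), and that the detailed balance equation (38) holds up to rounding.

```python
import itertools

# Hypothetical positive distribution pi on E = {0,1}^2 and selection weights q
E = list(itertools.product([0, 1], repeat=2))
pi = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
q = [0.5, 0.5]

def conditional(v, value, x):
    """pi_v(value | x_{-v}) as in (35)."""
    numerator = pi[tuple(value if w == v else x[w] for w in range(2))]
    denominator = sum(pi[y] for y in E
                      if all(y[w] == x[w] for w in range(2) if w != v))
    return numerator / denominator

def p(x, xp):
    """Transition probability (36): sum over components whose indicator fires."""
    return sum(q[v] * conditional(v, xp[v], x)
               for v in range(2)
               if all(xp[w] == x[w] for w in range(2) if w != v))

row_sums = [sum(p(x, xp) for xp in E) for x in E]          # stochastic matrix
diagonal = [p(x, x) for x in E]                            # > 0: aperiodicity
balance_gap = max(abs(pi[x] * p(x, xp) - pi[xp] * p(xp, x))
                  for x in E for xp in E)                  # (38), up to rounding
```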
Let $X_0, X_1, \ldots$ be a Markov chain with state space $E$ and the transition matrix $\mathbf{P}$ given by (36). As a consequence of Theorem 3.11 we get that in this case

 $\lim_{n \to \infty} \alpha_n(\mathbf{x}) = \pi(\mathbf{x})$ for all $\mathbf{x} \in E$ (39)

for any initial distribution $\alpha_0$ of $X_0$, where $\alpha_n$ denotes the distribution of $X_n$. Furthermore, the Gibbs sampler shows the following monotonic behavior.

Theorem 3.12   For all $n = 0, 1, \ldots$,

 $\sum_{\mathbf{x} \in E} \bigl|\alpha_{n+1}(\mathbf{x}) - \pi(\mathbf{x})\bigr| \;\le\; \sum_{\mathbf{x} \in E} \bigl|\alpha_n(\mathbf{x}) - \pi(\mathbf{x})\bigr|.$ (40)

Proof

• For arbitrary $v \in V$ and $\mathbf{x}' \in E$, formula (35) implies

 $\sum_{\mathbf{x} \in E:\, \mathbf{x}_{-v} = \mathbf{x}'_{-v}} \pi(\mathbf{x})\, \pi_v(x'_v \mid \mathbf{x}_{-v}) \;=\; \pi_v(x'_v \mid \mathbf{x}'_{-v}) \sum_{\mathbf{x} \in E:\, \mathbf{x}_{-v} = \mathbf{x}'_{-v}} \pi(\mathbf{x}) \;=\; \pi(\mathbf{x}').$ (41)

• Using this and the definition (36) of the transition matrix we obtain $\sum_{\mathbf{x} \in E} \pi(\mathbf{x})\, p(\mathbf{x}, \mathbf{x}') = \pi(\mathbf{x}')$ and hence

 $\sum_{\mathbf{x}' \in E} \bigl|\alpha_{n+1}(\mathbf{x}') - \pi(\mathbf{x}')\bigr| = \sum_{\mathbf{x}' \in E} \Bigl|\sum_{\mathbf{x} \in E} \bigl(\alpha_n(\mathbf{x}) - \pi(\mathbf{x})\bigr)\, p(\mathbf{x}, \mathbf{x}')\Bigr| \le \sum_{\mathbf{x} \in E} \bigl|\alpha_n(\mathbf{x}) - \pi(\mathbf{x})\bigr|.$
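The monotone decrease (40) can be observed directly by iterating $\alpha_{n+1} = \alpha_n \mathbf{P}$ for a small example. The distribution `pi` and all names below are an illustrative toy setup, not taken from the text; the matrix is the random-scan matrix (36) on $E = \{0,1\}^2$.

```python
import itertools

# Hypothetical positive target pi on E = {0,1}^2, random-scan matrix (36)
E = list(itertools.product([0, 1], repeat=2))
pi = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
q = [0.5, 0.5]

def conditional(v, value, x):
    numerator = pi[tuple(value if w == v else x[w] for w in range(2))]
    denominator = sum(pi[y] for y in E
                      if all(y[w] == x[w] for w in range(2) if w != v))
    return numerator / denominator

def p(x, xp):
    return sum(q[v] * conditional(v, xp[v], x)
               for v in range(2)
               if all(xp[w] == x[w] for w in range(2) if w != v))

alpha = {x: 1.0 if x == (0, 0) else 0.0 for x in E}   # point mass as alpha_0
distances = []
for _ in range(20):
    distances.append(sum(abs(alpha[x] - pi[x]) for x in E))
    alpha = {xp: sum(alpha[x] * p(x, xp) for x in E) for xp in E}  # alpha P
```

The recorded distances decrease monotonically toward zero, as (39) and (40) predict.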

Remarks

• A modified version of the Gibbs sampler considered in this section is the so-called cyclic Gibbs sampler, which uses a different procedure for picking the component to be updated.
• Namely, the component is not chosen at random according to a (given) probability function $q = (q_v,\, v \in V)$, where $q_v > 0$ for all $v \in V$,
• but the components are ordered linearly and chosen one after another according to this order. The selection of the update candidates thus becomes a deterministic procedure.
• If $V = \{1, \ldots, n\}$ for some $n \ge 1$, then for every $v \in V$ the matrix $\mathbf{P}_v = (p_v(\mathbf{x}, \mathbf{x}'))_{\mathbf{x}, \mathbf{x}' \in E}$ of the transition probabilities in step $v$ is given as

 $p_v(\mathbf{x}, \mathbf{x}') = \pi_v(x'_v \mid \mathbf{x}_{-v})\, \mathbb{1}\bigl(x'_w = x_w \text{ for all } w \neq v\bigr).$ (42)

• For an entire (scan) cycle, updating each component exactly once, one obtains the following transition matrix:

 $\mathbf{P} = \mathbf{P}_1 \mathbf{P}_2 \cdots \mathbf{P}_n.$ (43)

• It is easy to show that the matrix $\mathbf{P}$ given by (42) and (43)
• is irreducible and aperiodic
• and that $\pi$ is the stationary (limit) distribution of $\mathbf{P}$, as
• for all $v \in V$ and for all $\mathbf{x}' \in E$ formulae (41) and (42) imply that

 $\sum_{\mathbf{x} \in E} \pi(\mathbf{x})\, p_v(\mathbf{x}, \mathbf{x}') = \sum_{\mathbf{x} \in E:\, \mathbf{x}_{-v} = \mathbf{x}'_{-v}} \pi(\mathbf{x})\, \pi_v(x'_v \mid \mathbf{x}_{-v}) = \pi(\mathbf{x}'),$ i.e. $\pi \mathbf{P}_v = \pi$,

• and hence also

 $\pi \mathbf{P} = \pi \mathbf{P}_1 \mathbf{P}_2 \cdots \mathbf{P}_n = \pi.$
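For a small hypothetical example (the distribution `pi` and all names below are illustrative, not from the text), one can multiply the step matrices (42) into the cycle matrix (43) and confirm numerically that $\pi$ is stationary while the cycle matrix itself fails the detailed balance equation.

```python
import itertools

E = list(itertools.product([0, 1], repeat=2))
pi = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def conditional(v, value, x):
    numerator = pi[tuple(value if w == v else x[w] for w in range(2))]
    denominator = sum(pi[y] for y in E
                      if all(y[w] == x[w] for w in range(2) if w != v))
    return numerator / denominator

def step_matrix(v):
    """P_v from (42): update component v, leave the other one fixed."""
    return {(x, xp): conditional(v, xp[v], x)
            if all(xp[w] == x[w] for w in range(2) if w != v) else 0.0
            for x in E for xp in E}

def matmul(A, B):
    return {(x, xp): sum(A[(x, y)] * B[(y, xp)] for y in E)
            for x in E for xp in E}

P = matmul(step_matrix(0), step_matrix(1))            # cycle matrix (43)
stationary_gap = max(abs(sum(pi[x] * P[(x, xp)] for x in E) - pi[xp])
                     for xp in E)                     # pi P = pi
reversibility_gap = max(abs(pi[x] * P[(x, xp)] - pi[xp] * P[(xp, x)])
                        for x in E for xp in E)       # > 0: not reversible here
```

In this example the stationarity gap vanishes up to rounding, while the reversibility gap is clearly positive, illustrating the remark below that the pair formed by $\pi$ and the cycle matrix is in general not reversible.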

• The pair $(\pi, \mathbf{P})$ is in general not reversible. However, in Section 2.3.4 we showed that the pair $(\pi, \mathbf{P}\tilde{\mathbf{P}})$ is reversible, where

 $\tilde{p}(\mathbf{x}, \mathbf{x}') = \dfrac{\pi(\mathbf{x}')\, p(\mathbf{x}', \mathbf{x})}{\pi(\mathbf{x})}$ for $\mathbf{x}, \mathbf{x}' \in E$ (44)

defines the time-reversed matrix $\tilde{\mathbf{P}} = (\tilde{p}(\mathbf{x}, \mathbf{x}'))_{\mathbf{x}, \mathbf{x}' \in E}$ and $\mathbf{P}\tilde{\mathbf{P}}$ denotes the multiplicative reversible version of $\mathbf{P}$.

Theorem 3.13   The matrix $\mathbf{P}\tilde{\mathbf{P}}$ has the following representation:

 $\mathbf{P}\tilde{\mathbf{P}} = \mathbf{P}_1 \cdots \mathbf{P}_n \mathbf{P}_n \cdots \mathbf{P}_1,$ (45)

i.e., the multiplicative reversible version of the "forward-scan matrix" $\mathbf{P} = \mathbf{P}_1 \cdots \mathbf{P}_n$ coincides with the "forward-backward scan matrix".

Proof

• It suffices to show that $\tilde{\mathbf{P}} = \mathbf{P}_n \cdots \mathbf{P}_1$ for the matrix $\tilde{\mathbf{P}}$ defined by (44).
• Formulae (42)-(44) imply for arbitrary $\mathbf{x}, \mathbf{x}' \in E$

 $\tilde{p}(\mathbf{x}, \mathbf{x}') = \dfrac{\pi(\mathbf{x}')\, p(\mathbf{x}', \mathbf{x})}{\pi(\mathbf{x})} = \dfrac{1}{\pi(\mathbf{x})} \sum_{\mathbf{y}_1, \ldots, \mathbf{y}_{n-1} \in E} \pi(\mathbf{x}')\, p_1(\mathbf{x}', \mathbf{y}_1)\, p_2(\mathbf{y}_1, \mathbf{y}_2) \cdots p_n(\mathbf{y}_{n-1}, \mathbf{x}).$

• This and (35) yield (similar to the proof of (38)) that each pair $(\pi, \mathbf{P}_v)$ is reversible, i.e. $\pi(\mathbf{x})\, p_v(\mathbf{x}, \mathbf{x}') = \pi(\mathbf{x}')\, p_v(\mathbf{x}', \mathbf{x})$ for all $\mathbf{x}, \mathbf{x}' \in E$; applying this identity successively to the factors above gives

 $\tilde{p}(\mathbf{x}, \mathbf{x}') = \sum_{\mathbf{y}_1, \ldots, \mathbf{y}_{n-1} \in E} p_n(\mathbf{x}, \mathbf{y}_{n-1})\, p_{n-1}(\mathbf{y}_{n-1}, \mathbf{y}_{n-2}) \cdots p_1(\mathbf{y}_1, \mathbf{x}') = \bigl(\mathbf{P}_n \cdots \mathbf{P}_1\bigr)(\mathbf{x}, \mathbf{x}').$
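The representation (45) can be verified numerically for a small hypothetical example with $n = 2$ components (the distribution `pi` and all names below are illustrative, not from the text): the time reversal (44) of the forward scan $\mathbf{P}_1\mathbf{P}_2$ should equal the backward scan $\mathbf{P}_2\mathbf{P}_1$, and the product $\mathbf{P}_1\mathbf{P}_2\mathbf{P}_2\mathbf{P}_1$ should satisfy detailed balance with respect to $\pi$.

```python
import itertools

E = list(itertools.product([0, 1], repeat=2))
pi = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def conditional(v, value, x):
    numerator = pi[tuple(value if w == v else x[w] for w in range(2))]
    denominator = sum(pi[y] for y in E
                      if all(y[w] == x[w] for w in range(2) if w != v))
    return numerator / denominator

def step_matrix(v):
    return {(x, xp): conditional(v, xp[v], x)
            if all(xp[w] == x[w] for w in range(2) if w != v) else 0.0
            for x in E for xp in E}

def matmul(A, B):
    return {(x, xp): sum(A[(x, y)] * B[(y, xp)] for y in E)
            for x in E for xp in E}

P1, P2 = step_matrix(0), step_matrix(1)
P_forward = matmul(P1, P2)                                   # forward scan (43)
P_reversed = {(x, xp): pi[xp] * P_forward[(xp, x)] / pi[x]   # time reversal (44)
              for x in E for xp in E}
P_backward = matmul(P2, P1)
reversal_gap = max(abs(P_reversed[(x, xp)] - P_backward[(x, xp)])
                   for x in E for xp in E)                   # tilde(P) = P2 P1

M = matmul(P_forward, P_reversed)                            # P tilde(P), cf. (45)
reversible_gap = max(abs(pi[x] * M[(x, xp)] - pi[xp] * M[(xp, x)])
                     for x in E for xp in E)                 # detailed balance
```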
Remarks

• If Gibbs samplers are used in practice it is always assumed
• that the conditional probabilities $\pi_v(\cdot \mid \cdot)$ considered in (36) and (42) only depend on the vector $(x_w,\, w \in N(v))$ of the values
• obtained by the random vector $X$ in a certain small neighborhood $N(v) \subset V$ of $v$.
• The family $\{N(v),\, v \in V\}$ of subsets of $V$ is called a system of neighborhoods if for arbitrary $v, w \in V$
(a)
$v \notin N(v)$,
(b)
$w \in N(v)$ implies $v \in N(w)$.
• For the hard-core model from Section 3.3.1, $N(v)$ is the set of those vertices $w$ that are directly connected to $v$ by an edge.
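For the hard-core model this locality is easy to make concrete in code. The sketch below uses a small hypothetical instance, a path graph with three vertices and the uniform distribution on admissible configurations (all names are illustrative): the full conditional of a vertex, computed from $\pi$ as in (35), agrees with the purely local rule "value 1 has probability 1/2 if no neighbor is occupied, and probability 0 otherwise".

```python
import itertools

edges = [(0, 1), (1, 2)]                      # path graph on V = {0, 1, 2}
N = {v: {w for e in edges for w in e if v in e and w != v} for v in range(3)}

# Admissible configurations: no two adjacent vertices both occupied
admissible = [x for x in itertools.product([0, 1], repeat=3)
              if all(x[a] * x[b] == 0 for a, b in edges)]
pi = {x: 1.0 / len(admissible) for x in admissible}   # uniform hard-core measure

def conditional_occupied(v, x):
    """pi_v(1 | x_{-v}) computed from the full distribution pi, as in (35)."""
    compatible = [y for y in admissible
                  if all(y[w] == x[w] for w in range(3) if w != v)]
    return (sum(pi[y] for y in compatible if y[v] == 1)
            / sum(pi[y] for y in compatible))

def local_rule(v, x):
    """The same conditional, read off from the neighborhood N(v) alone."""
    return 0.0 if any(x[w] == 1 for w in N[v]) else 0.5

gap = max(abs(conditional_occupied(v, x) - local_rule(v, x))
          for x in admissible for v in range(3))
```

The neighborhoods satisfy properties (a) and (b), and the gap between the two conditionals is zero, so a Gibbs update of vertex $v$ only needs to inspect $N(v)$.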

Ursa Pantle 2006-07-20