
### MCMC Estimators; Bias and Fundamental Matrix

In this section we will investigate the characteristics of Monte-Carlo estimators for expectations.

• Examples of similar problems were already discussed in Section 3.1.1, where we estimated quantities by statistical means and computed the values of integrals via Monte-Carlo simulation.
• For those purposes, however, we assumed that the pseudo-random numbers can be regarded as realizations of independent and identically distributed sampling variables.
• In the present section we assume instead that the sampling variables form an (appropriately chosen) Markov chain. This is the reason why these estimators are called Markov-Chain-Monte-Carlo estimators (MCMC estimators).

Statistical Model

• Let $E = \{1, \ldots, \ell\}$ be a finite (nonempty) index set and let $X$ be a discrete random vector, taking values in the finite state space $E$ with probability $\pi_i = P(X = i)$, where $E$ is identified with the set of the first $\ell$ natural numbers.
• Furthermore, we assume $\pi_i > 0$ for all $i \in E$, where $\boldsymbol{\pi} = (\pi_1, \ldots, \pi_\ell)^\top$ denotes the probability function of the random vector $X$.
• Our goal is to estimate the expectation $\theta = \mathbb{E}\,\varphi(X)$ via MCMC simulation, where

$$\theta = \mathbb{E}\,\varphi(X) = \sum_{i \in E} \varphi(i)\,\pi_i \qquad (69)$$

• and $\varphi : E \to \mathbb{R}$ is an arbitrary but fixed function.
• As an estimator for $\theta$ we consider the random variable

$$\hat{\theta}_n = \frac{1}{n} \sum_{k=0}^{n-1} \varphi(X_k), \qquad (70)$$

• where $X_0, X_1, \ldots$ is a Markov chain with state space $E$, arbitrary but fixed initial distribution $\boldsymbol{\alpha}$, and an irreducible and aperiodic transition matrix $\mathbf{P}$, such that $\boldsymbol{\pi}$ is the ergodic limit distribution with respect to $\mathbf{P}$.
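The statistical model above can be illustrated with a small numerical sketch. The following Python/numpy snippet simulates the estimator (70) for a hypothetical two-state chain; the matrix `P`, the distribution `pi`, the function `phi`, and the random seed are invented for illustration and do not come from the text.

```python
import numpy as np

# Illustrative two-state example (not from the text): P is irreducible and
# aperiodic, and pi is its ergodic limit distribution.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])          # transition matrix
pi = np.array([2/3, 1/3])           # stationary/limit distribution of P
phi = np.array([1.0, 3.0])          # function phi on E = {0, 1}
theta = pi @ phi                    # target expectation (69)

rng = np.random.default_rng(0)

def mcmc_estimate(n, alpha=(1.0, 0.0)):
    """MCMC estimator (70): average of phi along one chain path of length n."""
    x = rng.choice(2, p=np.asarray(alpha))   # X_0 drawn from alpha
    total = 0.0
    for _ in range(n):
        total += phi[x]
        x = rng.choice(2, p=P[x])            # one transition step
    return total / n

print(theta, mcmc_estimate(50_000))
```

For a long path the estimate lands close to $\theta = 5/3$, even though the chain is started from an initial distribution $\boldsymbol{\alpha}$ different from $\boldsymbol{\pi}$.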

Remarks

• Typically, the initial distribution $\boldsymbol{\alpha}$ does not coincide with the simulated distribution $\boldsymbol{\pi}$.
• Consequently, the MCMC estimator $\hat{\theta}_n$ defined by (70) is not unbiased for fixed (finite) sample size $n$, i.e., in general $\mathbb{E}\,\hat{\theta}_n \neq \theta$.
• For determining the bias $\mathbb{E}\,\hat{\theta}_n - \theta$, the following representation formula will be helpful.

Theorem 3.17   For all $n \geq 1$,

$$\mathbb{E}\,\hat{\theta}_n = \frac{1}{n} \sum_{k=0}^{n-1} \boldsymbol{\alpha}^\top \mathbf{P}^k \boldsymbol{\varphi}, \qquad (71)$$

where $\boldsymbol{\varphi} = (\varphi(1), \ldots, \varphi(\ell))^\top$.

Proof

• In Theorem 2.3 we proved that for all $k \geq 0$ the distribution of $X_k$ is given by $\boldsymbol{\alpha}^\top \mathbf{P}^k$.
• Thus, by definition (70) of the MCMC estimator $\hat{\theta}_n$, we get that

$$\mathbb{E}\,\hat{\theta}_n = \frac{1}{n} \sum_{k=0}^{n-1} \mathbb{E}\,\varphi(X_k) = \frac{1}{n} \sum_{k=0}^{n-1} \boldsymbol{\alpha}^\top \mathbf{P}^k \boldsymbol{\varphi}.$$
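The representation formula (71) can be checked exactly for a tiny example: for a short path length the expectation $\mathbb{E}\,\hat{\theta}_n$ can be computed by exhaustive enumeration of all paths and compared with the matrix expression. All concrete numbers below (the two-state `P`, `phi`, `alpha`) are illustrative assumptions.

```python
import itertools
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
phi = np.array([1.0, 3.0])
alpha = np.array([0.5, 0.5])        # illustrative initial distribution
n = 3

# Right-hand side of (71): (1/n) * sum_{k=0}^{n-1} alpha^T P^k phi
rhs = sum(alpha @ np.linalg.matrix_power(P, k) @ phi for k in range(n)) / n

# Left-hand side: E[theta_hat_n] by enumerating all 2^n paths (x_0,...,x_{n-1})
lhs = 0.0
for path in itertools.product(range(2), repeat=n):
    prob = alpha[path[0]]
    for a, b in zip(path, path[1:]):
        prob *= P[a, b]             # path probability alpha_{x_0} p_{x_0 x_1} ...
    lhs += prob * np.mean(phi[list(path)])

print(lhs, rhs)
```

The two numbers agree up to floating-point rounding, which is exactly the statement of Theorem 3.17.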

Remarks

• As an immediate consequence of Theorem 3.17, the ergodicity of the transition matrix $\mathbf{P}$, and (69), one obtains

$$\lim_{n \to \infty} \mathbb{E}\,\hat{\theta}_n = \boldsymbol{\pi}^\top \boldsymbol{\varphi} = \theta,$$

• i.e., the MCMC estimator $\hat{\theta}_n$ for $\theta$ defined in (70) is asymptotically unbiased.

Apart from this, the asymptotic behavior of the bias $\mathbb{E}\,\hat{\theta}_n - \theta$ for $n \to \infty$ can be determined. For this purpose we need the following two lemmata.

Lemma 3.2   Let $\boldsymbol{\Pi}$ be the $\ell \times \ell$ matrix consisting of the $\ell$ identical row vectors $\boldsymbol{\pi}^\top = (\pi_1, \ldots, \pi_\ell)$. Then

$$(\mathbf{P} - \boldsymbol{\Pi})^k = \mathbf{P}^k - \boldsymbol{\Pi} \qquad (72)$$

for all $k \geq 1$ and in particular

$$\lim_{k \to \infty} (\mathbf{P} - \boldsymbol{\Pi})^k = \mathbf{0}. \qquad (73)$$

Proof

• Evidently, (72) holds for $k = 1$.
• If we assume that (72) holds for some $k \geq 1$, then

$$(\mathbf{P} - \boldsymbol{\Pi})^{k+1} = (\mathbf{P}^k - \boldsymbol{\Pi})(\mathbf{P} - \boldsymbol{\Pi}) = \mathbf{P}^{k+1} - \boldsymbol{\Pi},$$

where the last equality follows from the fact that

$$\mathbf{P}^k \boldsymbol{\Pi} = \boldsymbol{\Pi} \mathbf{P} = \boldsymbol{\Pi}^2 = \boldsymbol{\Pi}$$

and thus

$$(\mathbf{P}^k - \boldsymbol{\Pi})(\mathbf{P} - \boldsymbol{\Pi}) = \mathbf{P}^{k+1} - \mathbf{P}^k \boldsymbol{\Pi} - \boldsymbol{\Pi} \mathbf{P} + \boldsymbol{\Pi}^2 = \mathbf{P}^{k+1} - \boldsymbol{\Pi}.$$

• This proves (72) for all $k \geq 1$.
• As $\mathbf{P}$ is assumed to be irreducible and aperiodic, by Theorems 2.4 and 2.9 we get that $\mathbf{P}^k \to \boldsymbol{\Pi}$ if $k \to \infty$.
• Thus, by (72), also $(\mathbf{P} - \boldsymbol{\Pi})^k \to \mathbf{0}$ if $k \to \infty$.
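Both statements of Lemma 3.2 are easy to verify numerically; the following sketch reuses the illustrative two-state chain from above (all numbers are assumptions, not from the text).

```python
import numpy as np

# Illustrative two-state transition matrix (irreducible, aperiodic);
# its stationary distribution is pi = (2/3, 1/3).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([2/3, 1/3])
Pi = np.tile(pi, (2, 1))            # matrix with identical rows pi^T

# (72): (P - Pi)^k equals P^k - Pi for every k >= 1
dev = max(np.abs(np.linalg.matrix_power(P - Pi, k)
                 - (np.linalg.matrix_power(P, k) - Pi)).max()
          for k in range(1, 10))
print(dev)

# (73): (P - Pi)^k -> 0; here the decay is geometric, governed by the
# second eigenvalue 0.7 of P
print(np.abs(np.linalg.matrix_power(P - Pi, 60)).max())
```

The first printed value is zero up to rounding, and the second is of order $0.7^{60}$, illustrating the geometric zero convergence in (73).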

Remarks

• By the zero convergence (73) for $\mathbf{P} - \boldsymbol{\Pi}$ in Lemma 3.2 and by Lemma 2.4, the matrix $\mathbf{I} - (\mathbf{P} - \boldsymbol{\Pi})$ is invertible.
• In order to show this it suffices to consider the matrix $\mathbf{A} = \mathbf{P} - \boldsymbol{\Pi}$ in Lemma 2.4.
• The inverse matrix

$$\mathbf{Z} = \bigl(\mathbf{I} - (\mathbf{P} - \boldsymbol{\Pi})\bigr)^{-1} = (\mathbf{I} - \mathbf{P} + \boldsymbol{\Pi})^{-1} \qquad (74)$$

is hence well defined. It is called the fundamental matrix of $\mathbf{P}$.
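Numerically, the fundamental matrix is just an ordinary matrix inverse. The sketch below computes it for the illustrative two-state chain used above (the concrete numbers are assumptions).

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([2/3, 1/3])
Pi = np.tile(pi, (2, 1))            # matrix with identical rows pi^T

# (74): Z = (I - (P - Pi))^{-1} = (I - P + Pi)^{-1}
Z = np.linalg.inv(np.eye(2) - P + Pi)
print(Z)
```

Two simple consistency checks: since $(\mathbf{I} - \mathbf{P} + \boldsymbol{\Pi})\mathbf{1} = \mathbf{1}$ and $\boldsymbol{\pi}^\top(\mathbf{I} - \mathbf{P} + \boldsymbol{\Pi}) = \boldsymbol{\pi}^\top$, the fundamental matrix satisfies $\mathbf{Z}\mathbf{1} = \mathbf{1}$ and $\boldsymbol{\pi}^\top \mathbf{Z} = \boldsymbol{\pi}^\top$.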

Lemma 3.3   The fundamental matrix $\mathbf{Z}$ of the irreducible and aperiodic transition matrix $\mathbf{P}$ has the representation formulae

$$\mathbf{Z} = \mathbf{I} + \sum_{k=1}^{\infty} (\mathbf{P}^k - \boldsymbol{\Pi}) \qquad (75)$$

and

$$\mathbf{Z} = \lim_{n \to \infty} \frac{1}{n} \sum_{m=0}^{n-1} \sum_{k=0}^{m} (\mathbf{P} - \boldsymbol{\Pi})^k. \qquad (76)$$

Proof

• Formula (75) follows from Lemmas 2.4 and 3.2, as for $\mathbf{A} = \mathbf{P} - \boldsymbol{\Pi}$ the Neumann series in Lemma 2.4 gives

$$\mathbf{Z} = \sum_{k=0}^{\infty} (\mathbf{P} - \boldsymbol{\Pi})^k = \mathbf{I} + \sum_{k=1}^{\infty} (\mathbf{P}^k - \boldsymbol{\Pi}).$$

• In order to show (76) it suffices to notice that

$$\frac{1}{n} \sum_{m=0}^{n-1} \sum_{k=0}^{m} (\mathbf{P} - \boldsymbol{\Pi})^k = \sum_{k=0}^{n-1} (\mathbf{P} - \boldsymbol{\Pi})^k - \frac{1}{n} \sum_{k=0}^{n-1} k\,(\mathbf{P} - \boldsymbol{\Pi})^k$$

and that the first expression on the right-hand side converges to $\mathbf{Z}$ for $n \to \infty$, while the second converges to $\mathbf{0}$.
• The zero convergence is due to the fact that for every $\ell \times \ell$ matrix $\mathbf{A}$ with $\mathbf{A}^k \to \mathbf{0}$ there are constants $c > 0$ and $\varrho \in (0, 1)$ such that $\|\mathbf{A}^k\| \leq c\,\varrho^k$ for all $k \geq 1$, and thus for $\mathbf{A} = \mathbf{P} - \boldsymbol{\Pi}$

$$\left\| \frac{1}{n} \sum_{k=0}^{n-1} k\,(\mathbf{P} - \boldsymbol{\Pi})^k \right\| \leq \frac{c}{n} \sum_{k=0}^{\infty} k\,\varrho^k \longrightarrow 0 \qquad (n \to \infty).$$
Theorem 3.17 and Lemma 3.3 enable us to give a more detailed description of the asymptotic behavior of the bias $\mathbb{E}\,\hat{\theta}_n - \theta$.

Theorem 3.18
• Let $c = \boldsymbol{\alpha}^\top (\mathbf{Z} - \boldsymbol{\Pi}) \boldsymbol{\varphi}$, where $\mathbf{Z}$ denotes the fundamental matrix of $\mathbf{P}$ that was introduced in (74).
• Then, for all $n \geq 1$,

$$\mathbb{E}\,\hat{\theta}_n - \theta = \frac{c}{n} + r_n, \qquad (77)$$

where $r_n$ is a remainder such that $n\,r_n \to 0$ for $n \to \infty$.

Proof

• The representation formula (75) in Lemma 3.3 yields

$$\sum_{k=0}^{n-1} (\mathbf{P}^k - \boldsymbol{\Pi}) = \mathbf{Z} - \boldsymbol{\Pi} - \sum_{k=n}^{\infty} (\mathbf{P}^k - \boldsymbol{\Pi}).$$

• Hence, by taking into account Theorem 3.17 and $\boldsymbol{\alpha}^\top \boldsymbol{\Pi} \boldsymbol{\varphi} = \boldsymbol{\pi}^\top \boldsymbol{\varphi} = \theta$, we obtain the following for a certain sequence $\varepsilon_1, \varepsilon_2, \ldots$ such that $\varepsilon_n \to 0$:

$$\mathbb{E}\,\hat{\theta}_n - \theta = \frac{1}{n} \sum_{k=0}^{n-1} \boldsymbol{\alpha}^\top (\mathbf{P}^k - \boldsymbol{\Pi}) \boldsymbol{\varphi} = \frac{1}{n}\,\boldsymbol{\alpha}^\top (\mathbf{Z} - \boldsymbol{\Pi}) \boldsymbol{\varphi} + \frac{\varepsilon_n}{n} = \frac{c}{n} + \frac{\varepsilon_n}{n},$$

where $\varepsilon_n = -\,\boldsymbol{\alpha}^\top \bigl( \sum_{k=n}^{\infty} (\mathbf{P}^k - \boldsymbol{\Pi}) \bigr) \boldsymbol{\varphi} \to 0$ by the geometric bound from the proof of Lemma 3.3.
Ursa Pantle 2006-07-20