
### Recursive Representation

• In this section we will show
• how Markov chains can be constructed from sequences of independent and identically distributed random variables,
• that the recursion formulae (9), (10), (11) and (13) are special cases of a general principle for the construction of Markov chains,
• and that, vice versa, every Markov chain can be regarded as the solution of a recursive stochastic equation.
• As usual, let $E$ be a finite (or countably infinite) set.
• Furthermore, let $(D, \mathcal{D})$ be a measurable space; e.g., $D$ could be the $d$-dimensional Euclidean space $\mathbb{R}^d$ and $\mathcal{D}$ the Borel $\sigma$-algebra $\mathcal{B}(\mathbb{R}^d)$ on $\mathbb{R}^d$, or $D$ could be defined as the unit interval $[0,1]$ and $\mathcal{D}$ as the Borel $\sigma$-algebra $\mathcal{B}([0,1])$ on $[0,1]$.
• Let now $Z_1, Z_2, \ldots$ be a sequence of independent and identically distributed random variables mapping into $D$, and let $X_0$ be independent of $Z_1, Z_2, \ldots$.
• Let the random variables $X_1, X_2, \ldots$ be given by the stochastic recursion equation

$$X_n = \varphi(X_{n-1}, Z_n), \qquad n \ge 1, \tag{14}$$

where $\varphi : E \times D \to E$ is an arbitrary measurable function.
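To make the recursion concrete, here is a minimal Python sketch of a chain built from iid innovations as in (14); the cyclic random walk used as the update function is a hypothetical example, not taken from the text.

```python
import random

def simulate_recursion(x0, phi, innovations):
    """Generate X_0, X_1, ..., X_n via X_k = phi(X_{k-1}, Z_k), cf. (14)."""
    path = [x0]
    for z in innovations:
        path.append(phi(path[-1], z))
    return path

# Hypothetical example: cyclic random walk on E = {0, ..., 4},
# with update function phi(i, z) = (i + z) mod 5 and iid
# innovations Z_k drawn uniformly from {-1, +1}.
random.seed(1)
phi = lambda i, z: (i + z) % 5
innovations = [random.choice([-1, 1]) for _ in range(10)]
print(simulate_recursion(0, phi, innovations))
```

Any deterministic choice of `phi` together with an iid innovation sequence yields a Markov chain in this way; only the distribution of a single innovation and the function itself matter.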

Theorem 2.2
• Let the random variables $X_0, X_1, \ldots$ be given by (14).
• Then

$$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i)$$

holds for any $n \ge 0$ and $i_0, \ldots, i_{n-1}, i, j \in E$ such that $P(X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) > 0$.

Proof

• Formula (14) implies that

$$P(X_{n+1} = j \mid X_n = i, \ldots, X_0 = i_0) = P(\varphi(i, Z_{n+1}) = j \mid X_n = i, \ldots, X_0 = i_0) = P(\varphi(i, Z_{n+1}) = j),$$

• where the last equality follows from the transformation theorem for independent and identically distributed random variables (see Theorem WR-3.18),
• as the random variables $X_0, X_1, \ldots, X_n$ are functions of $X_0, Z_1, \ldots, Z_n$ and hence independent of $Z_{n+1}$.
• In the same way one concludes that

$$P(X_{n+1} = j \mid X_n = i) = P(\varphi(i, Z_{n+1}) = j). \qquad \Box$$

Remarks

• The proof of Theorem 2.2 yields that the conditional probability $P(X_{n+1} = j \mid X_n = i)$ is given by $p_{ij} = P(\varphi(i, Z_1) = j)$.
• $p_{ij}$ does not depend on $n$, as the "innovations" $Z_1, Z_2, \ldots$ are identically distributed.
• Moreover, the joint probability $P(X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n)$ is given by

$$P(X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n) = \alpha_{i_0}\, p_{i_0 i_1} \cdots p_{i_{n-1} i_n}, \tag{15}$$

where $\alpha_{i_0} = P(X_0 = i_0)$.
• Consequently, the sequence $X_0, X_1, \ldots$ of random variables given by the recursive definition (14) is a Markov chain following the definition given in (3).
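As a worked instance of formula (15), consider a hypothetical two-state example (not from the text) with $E = \{1, 2\}$: the probability of observing the path $1 \to 2 \to 2$ factorizes as

```latex
P(X_0 = 1,\, X_1 = 2,\, X_2 = 2) = \alpha_1\, p_{12}\, p_{22},
\qquad \text{e.g. } \; 0.5 \cdot 0.1 \cdot 0.6 = 0.03
```

for the illustrative values $\alpha_1 = 0.5$, $p_{12} = 0.1$, $p_{22} = 0.6$.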

Our next step will be to show that, vice versa, every Markov chain can be regarded as the solution of a recursive stochastic equation.

• Let $X_0, X_1, \ldots$ be a Markov chain with state space $E = \{1, \ldots, \ell\}$, initial distribution $\boldsymbol{\alpha} = (\alpha_1, \ldots, \alpha_\ell)$ and transition matrix $\mathbf{P} = (p_{ij})$.
• Based on a recursive equation of the form (14) we will construct a Markov chain $X_0', X_1', \ldots$ with initial distribution $\boldsymbol{\alpha}$ and transition matrix $\mathbf{P}$ such that

$$P(X_0 = i_0, \ldots, X_n = i_n) = P(X_0' = i_0, \ldots, X_n' = i_n) \tag{16}$$

for all $n \ge 0$ and $i_0, \ldots, i_n \in E$:

1. We start with a sequence $Z_0, Z_1, \ldots$ of independent random variables that are uniformly distributed on the interval $[0,1]$.
2. First of all, the $E$-valued random variable $X_0'$ is defined as follows: $X_0' = i$ if and only if $Z_0 \in [\alpha_1 + \ldots + \alpha_{i-1},\, \alpha_1 + \ldots + \alpha_i)$ for all $i \in E$, i.e.

$$X_0' = \sum_{i \in E} i\, \mathbb{1}\bigl(Z_0 \in [\alpha_1 + \ldots + \alpha_{i-1},\, \alpha_1 + \ldots + \alpha_i)\bigr). \tag{17}$$

3. The random variables $X_1', X_2', \ldots$ are defined by the recursive equation

$$X_n' = \varphi(X_{n-1}', Z_n), \qquad n \ge 1, \tag{18}$$

where the function $\varphi : E \times [0,1] \to E$ is given by

$$\varphi(i, z) = \sum_{k \in E} k\, \mathbb{1}\bigl(z \in [p_{i1} + \ldots + p_{i,k-1},\, p_{i1} + \ldots + p_{ik})\bigr). \tag{19}$$

• It is easy to see that the probabilities $P(X_0' = i_0, \ldots, X_n' = i_n)$ for the sequence $X_0', X_1', \ldots$ defined by (17)-(18) are given by (3), i.e., $X_0', X_1', \ldots$ is a Markov chain with initial distribution $\boldsymbol{\alpha}$ and transition matrix $\mathbf{P}$.

Remarks

• If (16) holds for two sequences $X_0, X_1, \ldots$ and $X_0', X_1', \ldots$ of random variables, these sequences are called stochastically equivalent.
• The construction principle (17)-(19) can be exploited for the Monte-Carlo simulation of Markov chains with given initial distribution and transition matrix.
• Markov chains on a countably infinite state space can be constructed and simulated in the same way. However, in this case (17)-(19) need to be modified by considering vectors and matrices of infinite dimensions.
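The construction principle (17)-(19) amounts to inverse-transform sampling: each uniform variable is mapped to the state whose cumulative-probability interval contains it. A minimal Python sketch of such a simulation follows; the function names and the two-state example chain are my own illustrative choices, not from the text (states are 0-indexed here, whereas the text uses $E = \{1, \ldots, \ell\}$).

```python
import random

def simulate_chain(alpha, P, n, rng=random):
    """Simulate X'_0, ..., X'_n via the construction (17)-(19):
    map each uniform Z to the state whose cumulative-probability
    interval contains it (inverse-transform sampling)."""
    def inverse_transform(probs, z):
        # Return the index k with z in [probs[0]+...+probs[k-1],
        #                               probs[0]+...+probs[k]).
        cumulative = 0.0
        for k, p in enumerate(probs):
            cumulative += p
            if z < cumulative:
                return k
        return len(probs) - 1  # guard against floating-point rounding

    x = inverse_transform(alpha, rng.random())    # initial state, cf. (17)
    path = [x]
    for _ in range(n):
        x = inverse_transform(P[x], rng.random())  # next state, cf. (18)-(19)
        path.append(x)
    return path

# Hypothetical two-state example chain (states 0 and 1):
alpha = [0.5, 0.5]
P = [[0.9, 0.1],
     [0.4, 0.6]]
random.seed(0)
print(simulate_chain(alpha, P, 10))
```

By (16), the paths generated this way are stochastically equivalent to those of any chain with the same initial distribution and transition matrix, which is exactly what the Monte-Carlo simulation of the preceding remark requires.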
Ursa Pantle 2006-07-20