

Stationary Initial Distributions

Theorem 2.10    

A proof of Theorem 2.10 can be found in Chapter 7 of E. Behrends (2000) Introduction to Markov Chains, Vieweg, Braunschweig.


Remarks
 


Definition
 


Theorem 2.11    

Proof
 


Remarks
 

Examples
 
  1. Queues


    see T. Rolski, H. Schmidli, V. Schmidt, J. Teugels (2002) Stochastic Processes for Insurance and Finance. J. Wiley & Sons, Chichester, pp. 147 ff.


    • We consider the example already discussed in Section 2.1.2
      • of the recursively defined Markov chain $ X_0,X_1,\ldots:\Omega\to\{0,1,\ldots\}$ with $ X_0=0$ and

        $\displaystyle X_n= \max\{0,X_{n-1}+Z_n-1\}\,,\qquad\forall\,n\ge 1\,,$ (63)

      • where the random variables $ Z,Z_1,Z_2,\ldots:\Omega\to\{0,1,\ldots\}$ are independent and identically distributed and the transition matrix $ {\mathbf{P}}=(p_{ij})$ is given by

        $\displaystyle p_{ij}=\left\{\begin{array}{ll} P(Z=j+1-i) &\mbox{if $j+1\ge i>0$,}\\ P(Z=j+1) &\mbox{if $i=0$, $j>0$,}\\ P(Z=0)+P(Z=1) &\mbox{if $j=i=0$,}\\ 0 &\mbox{otherwise.} \end{array}\right.$ (64)

    • It is not difficult to show that
      • the Markov chain $ \{X_n\}$ defined by the recursion formula (63) with its corresponding transition matrix (64) is irreducible and aperiodic if

        $\displaystyle P(Z=0)>0\,,\quad P(Z=1)>0$   and$\displaystyle \quad P(Z=2)>0\,,$ (65)

      • for all $ n\ge 1$ the solution of the recursion equation (63) can be written as

        $\displaystyle X_n=\max\Bigl\{0,\max\limits_{k\in\{1,\ldots,n\}}\sum\limits_{r=k}^n(Z_r-1)\Bigr\}\stackrel{{\rm d}}{=}\max\Bigl\{0,\max\limits_{k\in\{1,\ldots,n\}}\sum\limits_{r=1}^k(Z_r-1)\Bigr\}\,,$ (66)

      • the limit probabilities $ \pi_i$ exist for all $ i\in\{0,1,\ldots\}$ where
        $\displaystyle \pi_i$ $\displaystyle =$ $\displaystyle \lim\limits_{n\to\infty} P\Bigl(\max\Bigl\{0,\max\limits_{k\in\{1,\ldots,n\}}\sum\limits_{r=1}^k(Z_r-1)\Bigr\}=i\Bigr)$
          $\displaystyle =$ $\displaystyle \left\{\begin{array}{ll} P\Bigl(\sup\limits_{k\in\{1,2,\ldots\}}\sum\limits_{r=1}^k(Z_r-1)=i\Bigr) &\mbox{for $i>0$,}\\ P\Bigl(\sup\limits_{k\in\{1,2,\ldots\}}\sum\limits_{r=1}^k(Z_r-1)\le 0\Bigr) &\mbox{for $i=0$.} \end{array}\right.$

    • Furthermore

      $\displaystyle \begin{array}{ll} \pi_i=0\ \mbox{for all $i\ge 0$} &\mbox{if (65) holds and ${\mathbb{E}\,}Z\ge 1$,}\\ \pi_i>0\ \mbox{for all $i\ge 0$ and $\sum_{i=0}^\infty\pi_i=1$} &\mbox{if (65) holds and ${\mathbb{E}\,}Z<1$.} \end{array}$

    • Thus, for Markov chains with (countably) infinite state space,
      • irreducibility and aperiodicity do not always imply ergodicity,
      • but, additionally, a certain contraction condition needs to be satisfied,
      • where in the present example this condition is the requirement of a negative drift, i.e.,
        $ {\mathbb{E}\,}(Z-1)<0$.
    • If the conditions (65) are satisfied and $ {\mathbb{E}\,}
Z<1$, then
      • the equation $ {\boldsymbol{\alpha}}^\top={\boldsymbol{\alpha}}^\top{\mathbf{P}}$ has a uniquely determined probability solution $ {\boldsymbol{\alpha}}^\top=(\alpha_0,\alpha_1,\ldots)$,
      • which coincides with $ {\boldsymbol{\pi}}^\top=(\pi_0,\pi_1,\ldots)$ $ (=\lim_{n\to\infty}{\boldsymbol{\alpha}}_n^\top)$ but which in general cannot be determined explicitly.
      • However, there is a simple formula for the generating function $ g_{\boldsymbol{\pi}}:(-1,1)\to[0,1]$ of $ {\boldsymbol{\pi}}=(\pi_0,\pi_1,\ldots)^\top$, where

        $\displaystyle g_{\boldsymbol{\pi}}(s)=\sum\limits_{i=0}^\infty s^i\pi_i \qquad\Biggl(={\mathbb{E}\,}
s^{X_\infty}\Biggr)
$

        and

        $\displaystyle X_\infty=\max\Bigl\{0,\sup\limits_{k\in\{1,2,\ldots\}}\sum\limits_{r=1}^k(Z_r-1)\Bigr\}.$ (67)

      • Namely, we have

        $\displaystyle g_{\boldsymbol{\pi}}(s)=\frac{(1-\rho)(1-s)}{g_Z(s)-s}\;,\qquad\forall \, s\in(-1,1)\,,$ (68)

        where $ \rho={\mathbb{E}\,}Z$ and $ g_Z(s)={\mathbb{E}\,}s^Z$ is the generating function of $ Z$.


    • Proof of (68)
      • By the definition (67) of $ X_\infty$, we have $ X_\infty\stackrel{{\rm d}}{=}\max\{0,X_\infty+(Z-1)\}$, where on the right-hand side $ Z$ is independent of $ X_\infty$.
      • Furthermore, using the notation $ x_+=\max\{0,x\}$, we obtain
        $\displaystyle g_{\boldsymbol{\pi}}(s)$ $\displaystyle =$ $\displaystyle {\mathbb{E}\,}s^{X_\infty}={\mathbb{E}\,}s^{(X_\infty+Z-1)_+}$
          $\displaystyle =$ $\displaystyle {\mathbb{E}\,}\Bigl(s^{(X_\infty+Z-1)_+}\;{1\hspace{-1mm}{\rm I}}(X_\infty+Z-1\ge 0)\Bigr) +{\mathbb{E}\,}\Bigl(s^{(X_\infty+Z-1)_+}\;{1\hspace{-1mm}{\rm I}}(X_\infty+Z-1=-1)\Bigr)$
          $\displaystyle =$ $\displaystyle \frac{1}{s}\sum_{k=1}^\infty s^kP(X_\infty+Z=k)+P(X_\infty+Z=0)$
          $\displaystyle =$ $\displaystyle s^{-1}g_{\boldsymbol{\pi}}(s)g_Z(s) +(1-s^{-1}) P(X_\infty+Z=0)\,,$

        i.e.

        $\displaystyle g_{\boldsymbol{\pi}}(s)=\frac{(s-1)P(X_\infty+Z=0)}{s-g_Z(s)}\,.$ (69)

      • As

        $\displaystyle \lim_{s \uparrow1}g_{\boldsymbol{\pi}}(s)=1$   and$\displaystyle \qquad
\lim_{s\uparrow 1} \frac{d}{ds} g_Z(s)={\mathbb{E}\,}Z\,,
$

        by L'Hospital's rule we can conclude that

        $\displaystyle 1=\frac{P(X_\infty+Z=0)}{1-\rho}\;.
$

      • Hence (68) is a consequence of (69).
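
    • The relation (68) can also be checked numerically. The following Python sketch is only an illustration and is not part of the notes: the distribution $ P(Z=0)=0.5$, $ P(Z=1)=0.3$, $ P(Z=2)=0.2$ is an assumed example satisfying (65) with $ \rho={\mathbb{E}\,}Z=0.7<1$. The sketch simulates the recursion (63), estimates $ {\boldsymbol{\pi}}$ by the empirical distribution of $ X_n$ after a burn-in period, and compares the resulting generating function with the right-hand side of (68).

        import numpy as np

        rng = np.random.default_rng(0)

        # assumed distribution of Z; it satisfies (65) and E Z = 0.7 < 1
        z_values = np.array([0, 1, 2])
        z_probs = np.array([0.5, 0.3, 0.2])
        rho = float(np.dot(z_values, z_probs))   # rho = E Z

        # simulate the recursion (63): X_n = max(0, X_{n-1} + Z_n - 1), X_0 = 0
        n_steps = 200_000
        Z = rng.choice(z_values, size=n_steps, p=z_probs)
        X = np.zeros(n_steps + 1, dtype=int)
        for n in range(1, n_steps + 1):
            X[n] = max(0, X[n - 1] + Z[n - 1] - 1)

        # empirical distribution of X_n after a burn-in period as an estimate of pi
        burn_in = 10_000
        states, counts = np.unique(X[burn_in:], return_counts=True)
        pi_hat = counts / counts.sum()

        def g_Z(s):
            # generating function of Z
            return float(np.sum(z_probs * s ** z_values))

        def g_pi_formula(s):
            # right-hand side of (68)
            return (1 - rho) * (1 - s) / (g_Z(s) - s)

        def g_pi_empirical(s):
            # generating function of the empirical distribution pi_hat
            return float(np.sum(pi_hat * s ** states))

        for s in (0.2, 0.5, 0.9):
            print(f"s={s}: formula (68) = {g_pi_formula(s):.4f}, empirical = {g_pi_empirical(s):.4f}")

      Since the chain is ergodic under (65) and $ {\mathbb{E}\,}Z<1$, the time averages converge to $ {\boldsymbol{\pi}}$, so the two printed values should agree up to simulation error.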


  2. Birth and death processes with one reflecting barrier


    • We modify the example of the birth and death process discussed in Section 2.2.3, now considering the infinite state space $ E=\{0,1,\ldots\}$ and the transition matrix

      $\displaystyle {\mathbf{P}}=\left(\begin{array}{cccccccc} 0 & 1 & & & & & & \\ q_1 & r_1 & p_1 & & & & & \\ & q_2 & r_2 & p_2 & & & & \\ & & \ddots & \ddots & \ddots & & & \\ & & & q_i & r_i & p_i & & \\ & & & & \ddots & \ddots & \ddots & \\ \end{array}\right)$ (70)

      where $ p_i>0$, $ q_i>0$ and $ p_i+q_i+r_i=1$ is assumed for all $ i\in\{1,2,\ldots\}$.
    • The linear equation system $ {\boldsymbol{\alpha}}^\top={\boldsymbol{\alpha}}^\top{\mathbf{P}}$ is of the form

      $\displaystyle \alpha_i=\left\{\begin{array}{ll} p_{i-1}\alpha_{i-1}+r_i\alpha_i+q_{i+1}\alpha_{i+1} &\mbox{if $i>0$,}\\ q_1\alpha_1 &\mbox{if $i=0$,} \end{array}\right.$ (71)

      where $ p_0=1$ in accordance with the first row of the transition matrix (70).

    • Similarly to the birth and death processes with two reflecting barriers, one can show that
      • the equation system (71) has a uniquely determined probability solution $ {\boldsymbol{\alpha}}^\top$ if

        $\displaystyle \sum\limits_{j=1}^\infty\frac{p_1p_2\cdot\ldots\cdot p_j}{q_1q_2\cdot\ldots\cdot q_{j+1}}<\infty\,,$ (72)

      • the solution $ {\boldsymbol{\alpha}}^\top=(\alpha_0,\alpha_1,\ldots)$ of (71) is given by

        $\displaystyle \alpha_i=\alpha_0\;\frac{p_1p_2\cdot\ldots\cdot
p_{i-1}}{q_1q_2\cdot\ldots\cdot q_i}\;,\qquad\forall\,i>0\,,
$

      • where $ \alpha_0>0$ is defined by the condition $ \sum_{i=0}^\infty
\alpha_i=1$, i.e.

        $\displaystyle \alpha_0\Biggl(1+\frac{1}{q_1}+\sum\limits_{j=1}^\infty\frac{p_1p_2\cdot\ldots\cdot
p_j}{q_1q_2\cdot\ldots\cdot q_{j+1}}\Biggr)=1
$

        and, consequently,

        $\displaystyle \alpha_0=\Biggl(1+\frac{1}{q_1}+
\sum\limits_{j=1}^\infty\frac{p_1p_2\cdot\ldots\cdot
p_j}{q_1q_2\cdot\ldots\cdot q_{j+1}}\Biggr)^{-1}\,.
$

    • As we assume $ p_i>0$ and $ q_i>0$ for all $ i\in\{1,2,\ldots\}$, birth and death processes with one reflecting barrier are obviously irreducible.
    • Furthermore, if $ r_i>0$ for some $ i\in\{1,2,\ldots\}$, then birth and death processes with one reflecting barrier are also aperiodic (as well as ergodic if the contraction condition (72) is satisfied).
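    • As a numerical illustration (not part of the notes), the following Python sketch evaluates the formulas above for an assumed choice of constant rates $ p_i=p=0.3$, $ q_i=q=0.5$, $ r_i=r=0.2$, for which (72) holds since $ p/q<1$. It computes the stationary probabilities $ \alpha_i$ on a truncated state space and checks the stationarity equation $ {\boldsymbol{\alpha}}^\top={\boldsymbol{\alpha}}^\top{\mathbf{P}}$.

        import numpy as np

        # assumed constant rates; p < q ensures the contraction condition (72)
        p, q, r = 0.3, 0.5, 0.2
        assert abs(p + q + r - 1.0) < 1e-12

        N = 200  # truncation level used for the numerical check

        # alpha_i = alpha_0 * (p_1 ... p_{i-1}) / (q_1 ... q_i) for i > 0,
        # with alpha_0 fixed afterwards by the normalization sum_i alpha_i = 1
        alpha = np.empty(N + 1)
        alpha[0] = 1.0
        for i in range(1, N + 1):
            alpha[i] = alpha[0] * p ** (i - 1) / q ** i
        alpha /= alpha.sum()

        # truncated transition matrix (70): reflecting barrier at 0, first row (0, 1, 0, ...)
        P = np.zeros((N + 1, N + 1))
        P[0, 1] = 1.0
        for i in range(1, N + 1):
            P[i, i - 1] = q
            P[i, i] = r
            if i < N:
                P[i, i + 1] = p
            else:
                P[i, i] += p  # keep the truncated matrix stochastic

        # check the stationarity equation alpha^T = alpha^T P
        print("max |alpha^T P - alpha^T| =", np.max(np.abs(alpha @ P - alpha)))

      The printed deviation should be of the order of machine precision, since the truncation only affects the result through terms of order $ (p/q)^N$.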


