

Basic Definitions and Quasi-positive Transition Matrices


This serves as motivation to formally introduce the notion of ergodicity of Markov chains.

Definition
$ \;$ The Markov chain $ X_0,X_1,\ldots$ with state space $ E=\{1,\ldots,\ell\}$, transition matrix $ {\mathbf{P}}=(p_{ij})$ and the corresponding $ n$-step transition matrices $ {\mathbf{P}}^{(n)}=(p^{(n)}_{ij})$ $ (={\mathbf{P}}^n)$ is called ergodic if the limits

$\displaystyle \pi_j=\lim_{n\to\infty} p_{ij}^{(n)}$ (33)

  1. exist for all $ j\in E$
  2. are positive and independent of $ i\in E$
  3. form a probability function $ {\boldsymbol{\pi}}=(\pi_1,\ldots,\pi_\ell)^\top$, i.e. $ \sum_{j\in E}\pi_j=1$.
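The convergence in (33) can be checked numerically. The following minimal sketch (Python with NumPy; the two-state transition matrix is a made-up illustration and not taken from the text) computes a high power of $ {\mathbf{P}}$ and shows that its rows coincide, so the limits $ \pi_j$ exist, are positive and do not depend on the initial state $ i$.

import numpy as np

# Hypothetical 2x2 transition matrix (illustrative values only)
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# 20-step transition matrix P^(20)
P20 = np.linalg.matrix_power(P, 20)
print(P20)
# Both rows agree to roughly 10 decimal places, approximately [0.5714, 0.4286],
# i.e. pi_1 = 4/7 and pi_2 = 3/7 for this example matrix.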


Example
$ \;$ (Weather Forecast)


The ergodicity of Markov chains on an arbitrary finite state space can be characterized by the following notion from the theory of positive matrices.


Definition
$ \;$ A quadratic matrix $ {\mathbf{A}}=(a_{ij})$ with non-negative entries is called quasi-positive if there is a natural number $ n_0\ge 1$ such that all entries of $ {\mathbf{A}}^{n_0}$ are positive.


Remark
  If $ {\mathbf{A}}$ is a stochastic matrix and we can find a natural number $ n_0\ge 1$ such that all entries of $ {\mathbf{A}}^{n_0}$ are positive, then all entries of $ {\mathbf{A}}^n$ are positive for every natural number $ n\ge n_0$. This is easy to see: since each row of the stochastic matrix $ {\mathbf{A}}$ sums to $ 1$, positivity of all entries of $ {\mathbf{A}}^{n}$ implies $ ({\mathbf{A}}^{n+1})_{ij}=\sum_{k}a_{ik}\,({\mathbf{A}}^{n})_{kj}\ge\min_{k}({\mathbf{A}}^{n})_{kj}>0$ for all $ i,j$.
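A small numerical check of this remark (a sketch in Python/NumPy; the stochastic matrix below is a made-up example with a zero entry, so the matrix itself is not positive but its square already is, i.e. $ n_0=2$):

import numpy as np

# Hypothetical stochastic matrix with a zero entry (n_0 = 2 in this case)
A = np.array([[0.0, 1.0],
              [0.5, 0.5]])

for n in range(1, 8):
    all_positive = bool((np.linalg.matrix_power(A, n) > 0).all())
    print(n, all_positive)
# Prints False for n = 1 and True for every n >= 2: once A^(n_0) is
# positive, positivity is inherited by all higher powers.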


Theorem 2.4   The Markov chain $ X_0,X_1,\ldots$ with state space $ E=\{1,\ldots,\ell\}$ and transition matrix $ {\mathbf{P}}$ is ergodic if and only if $ {\mathbf{P}}$ is quasi-positive.

Proof
 


Remarks
 
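By Theorem 2.4, ergodicity of a Markov chain with finite state space can thus be decided by inspecting powers of $ {\mathbf{P}}$. The following helper is only a sketch (the function name and the power bound $ n_{\max}$ are not part of the text): if $ {\mathbf{P}}$ is quasi-positive with some $ n_0\le n_{\max}$, then by the remark above $ {\mathbf{P}}^{n_{\max}}$ itself is already positive, so it suffices to examine this single power.

import numpy as np

def is_quasi_positive(P, n_max):
    """Return True if P^n_max has only positive entries.

    By the remark above this is equivalent to P being quasi-positive with
    some n_0 <= n_max; a False result is inconclusive for larger n_0.
    """
    return bool((np.linalg.matrix_power(P, n_max) > 0).all())

# Hypothetical matrix from the previous sketch: quasi-positive, hence the
# corresponding Markov chain is ergodic by Theorem 2.4.
A = np.array([[0.0, 1.0],
              [0.5, 0.5]])
print(is_quasi_positive(A, 5))   # True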


Now we will show that the limits $ \pi_j=\lim_{n\to\infty} p_{ij}^{(n)}$ can be regarded as the solution of a system of linear equations.

Theorem 2.5   Let the Markov chain $ X_0,X_1,\ldots$ with state space $ E=\{1,\ldots,\ell\}$ and transition matrix $ {\mathbf{P}}$ be ergodic. Then the limits $ \pi_j=\lim_{n\to\infty} p_{ij}^{(n)}$ satisfy the system of linear equations $ \pi_j=\sum_{i\in E}\pi_i\,p_{ij}$ for all $ j\in E$, and they are the only solution with $ \sum_{j\in E}\pi_j=1$.

Proof
 

Remarks
 
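The system of linear equations in Theorem 2.5 can also be solved numerically instead of taking limits of $ {\mathbf{P}}^n$. The sketch below (Python/NumPy, reusing the hypothetical two-state matrix from above; the least-squares approach is just one possible implementation) stacks $ ({\mathbf{P}}^\top-{\mathbf{I}}){\boldsymbol{\pi}}=\mathbf{0}$ with the normalization $ \sum_{j\in E}\pi_j=1$; since the chain is ergodic, the probability solution is unique and coincides with the limits obtained from the powers of $ {\mathbf{P}}$.

import numpy as np

# Hypothetical two-state transition matrix (same illustrative values as above)
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
ell = P.shape[0]

# Stack the equations (P^T - I) pi = 0 and 1^T pi = 1
A = np.vstack([P.T - np.eye(ell), np.ones((1, ell))])
b = np.concatenate([np.zeros(ell), [1.0]])

pi = np.linalg.lstsq(A, b, rcond=None)[0]
print(pi)        # approximately [0.5714 0.4286], i.e. (4/7, 3/7)
print(pi @ P)    # reproduces pi, so pi solves pi_j = sum_i pi_i p_ij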

