Markov chains: time-reversal

Ergodic Theorem

Let $f$ be a real-valued function defined on $\Omega$. If $(X_n)_n$ is an irreducible Markov chain with stationary distribution $\pi$, then for any starting distribution $\mu$, we have
        $\lim_{n \to \infty} \frac{1}{n} \sum_{j=0}^n{f(X_j)} = \pi f$,  $\quad \mathbb{P}_\mu$ - almost surely (a.s.),
where $\pi f = \sum_{x \in \Omega} \pi(x) f(x)$.

In particular, taking $f = 1_{[x]}$ gives $\lim_{n \to \infty} \frac{1}{n} \sum_{j=0}^n 1_{[X_j = x]} = \pi(x)$,  $\quad \mathbb{P}_\mu$ - a.s.
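The ergodic theorem can be checked numerically: simulate a long trajectory and compare the empirical occupation frequencies with $\pi$. A minimal sketch, using an illustrative two-state chain whose stationary distribution is $\pi = (0.4, 0.6)$:

```python
import random

random.seed(0)

# Illustrative two-state chain on {0, 1}; its stationary distribution
# is pi = (0.4, 0.6), since pi P = pi.
P = [[0.7, 0.3],
     [0.2, 0.8]]

def run_chain(n, x0=0):
    """Simulate n steps and return the fraction of time spent in each state."""
    counts = [0, 0]
    x = x0
    for _ in range(n):
        counts[x] += 1
        x = 0 if random.random() < P[x][0] else 1
    return [c / n for c in counts]

freq = run_chain(200_000)
# By the ergodic theorem, freq approaches pi = (0.4, 0.6) as n grows,
# regardless of the starting state.
```

Note that the limit does not depend on the starting distribution $\mu$: rerunning with `x0=1` gives the same frequencies up to random fluctuation.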

Detailed balance equations

Definition Suppose that a probability measure $\pi$ on $\Omega$ satisfies
                    $\pi(x) P(x,y) = \pi(y) P(y,x), \quad \forall x, y \in \Omega$.
These are called the detailed balance equations.
Lemma Any distribution $\pi$ satisfying the detailed balance equations is stationary for $P$.
Definition A chain satisfying the detailed balance equations is called reversible.

Simple random walk on graph

Example: consider simple random walk on a connected graph $G = (V, E)$. The measure
              $\pi(x) = \frac{\deg(x)}{2|E|}, \quad x \in V$
satisfies the detailed balance equations; therefore simple random walk on $G$ is reversible.
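Detailed balance here reduces to $\pi(x) P(x,y) = \frac{1}{2|E|}$ for every edge $x \sim y$, which is symmetric in $x$ and $y$. A minimal sketch verifying this on a small illustrative graph, using exact rational arithmetic:

```python
from fractions import Fraction

# Small connected graph given by adjacency lists (illustrative choice).
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
num_edges = sum(len(nbrs) for nbrs in adj.values()) // 2  # |E| = 4

def P(x, y):
    """Simple-random-walk transition probability: uniform over neighbours."""
    return Fraction(1, len(adj[x])) if y in adj[x] else Fraction(0)

# pi(x) = deg(x) / (2|E|)
pi = {x: Fraction(len(adj[x]), 2 * num_edges) for x in adj}

# Detailed balance: pi(x) P(x,y) == pi(y) P(y,x) for all pairs x, y.
balanced = all(pi[x] * P(x, y) == pi[y] * P(y, x) for x in adj for y in adj)
```

For each edge, both sides of the equation equal $1/(2|E|)$, which is why the degree-proportional measure works.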

Time-reversal of Markov chain

Theorem
Let $(X_n)$ be an irreducible Markov chain with transition matrix P and stationary distribution $\pi$. Define $\hat P$ to be
                    $\hat P(x,y) = \frac{\pi(y) P(y, x)}{\pi(x)}$.
  • $\hat P$ is stochastic 
  • Let ($\hat X_n $) be a Markov chain with transition matrix $\hat P$. Then $\pi$ is also stationary for $\hat P$.
  • For any $x_0,\ldots, x_n \in \Omega$, we have
                    $\mathbb{P}_\pi [X_0 = x_0, \ldots, X_n = x_n] = \mathbb{P}_\pi[\hat X_0 = x_n, \ldots, \hat X_n = x_0]$.
We call $\hat X$ the time-reversal of X.
Remark If a chain with transition matrix $P$ is reversible, then $\hat P = P$, and $\hat X$ has the same law as $X$.
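The first two claims of the theorem can be verified directly from the definition of $\hat P$. A minimal sketch, using an illustrative non-reversible three-state chain (doubly stochastic, so $\pi$ is uniform and $\hat P$ comes out as the transpose of $P$):

```python
# Illustrative non-reversible 3-state chain; it is doubly stochastic,
# so its stationary distribution pi is uniform.
P = [[0.0, 0.8, 0.2],
     [0.2, 0.0, 0.8],
     [0.8, 0.2, 0.0]]
pi = [1/3, 1/3, 1/3]
n = len(P)

# Time-reversal: P_hat(x, y) = pi(y) P(y, x) / pi(x).
P_hat = [[pi[y] * P[y][x] / pi[x] for y in range(n)] for x in range(n)]

# P_hat is stochastic: each row sums to 1 (this uses pi P = pi).
row_sums = [sum(row) for row in P_hat]

# pi is also stationary for P_hat: (pi P_hat)(y) = pi(y) for every y.
pi_Phat = [sum(pi[x] * P_hat[x][y] for x in range(n)) for y in range(n)]
```

Since $P \neq \hat P$ here, the chain is not reversible: run forwards it cycles preferentially $0 \to 1 \to 2 \to 0$, while its time-reversal cycles the other way.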

Birth-and-Death chains

A birth-and-death chain has state space $\Omega = \{0, 1, \ldots, N\}$.
The current state can be thought of as the size of some population; in a single step of the chain there can be at most one birth or death. The transition probabilities can be specified by $\{(p_k, r_k, q_k)\}_{k=0}^N$, where $p_k + r_k + q_k = 1$ for each $k$ and
  • $p_k$ is the probability of moving from $k$ to $k+1$ when $0 \leq k < N$, and $p_N = 0$;
  • $q_k$ is the probability of moving from $k$ to $k-1$ when $0 < k \leq N$, and $q_0 = 0$;
  • $r_k$ is the probability of remaining at $k$ when $0 \leq k \leq N$.
Theorem Every birth-and-death chain is reversible.
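The proof idea is constructive: detailed balance $\pi(k) p_k = \pi(k+1) q_{k+1}$ determines $\pi$ up to normalization, so a reversible stationary measure always exists. A minimal sketch with illustrative rates on $\{0, \ldots, 3\}$:

```python
from fractions import Fraction

# Illustrative birth-and-death chain on {0, ..., 3}.
N = 3
p = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 4), Fraction(0)]  # births; p_N = 0
q = [Fraction(0), Fraction(1, 4), Fraction(1, 3), Fraction(1, 2)]  # deaths; q_0 = 0
r = [1 - p[k] - q[k] for k in range(N + 1)]                        # holding probabilities

# Solve detailed balance pi(k) p_k = pi(k+1) q_{k+1} up to normalization.
w = [Fraction(1)]
for k in range(N):
    w.append(w[-1] * p[k] / q[k + 1])
Z = sum(w)
pi = [x / Z for x in w]

def P(x, y):
    if y == x + 1: return p[x]
    if y == x - 1: return q[x]
    if y == x:     return r[x]
    return Fraction(0)

# Verify detailed balance for all pairs, confirming reversibility.
balanced = all(pi[x] * P(x, y) == pi[y] * P(y, x)
               for x in range(N + 1) for y in range(N + 1))
```

The key point is that transitions only occur between neighbouring states, so checking detailed balance on the pairs $(k, k+1)$ suffices, and the recursion above always has a positive solution.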

Total variation distance

Definition The total variation distance between two probability measures $\mu$ and $\nu$ on $\Omega$ is defined by
                $\|\mu - \nu\|_{TV} = \max_{A \subset \Omega} | \mu(A) - \nu(A)|$.
Lemma The total variation distance satisfies triangle inequality :
            $\|\mu - \nu\|_{TV} \leq \|\mu - \eta\|_{TV} + \|\eta - \nu\|_{TV}$.
Lemma 
            $\|\mu - \nu\|_{TV} = \frac{1}{2} \sum_{x \in \Omega} |\mu(x) - \nu(x)|$.
Lemma
            $\|\mu - \nu\|_{TV} = \frac{1}{2} \sup\{ \mu f - \nu f : f$ satisfying $\max_{x \in \Omega} |f(x)| \leq 1 \}$.
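On a small state space, the definition (a maximum over all subsets $A \subseteq \Omega$) and the $\frac{1}{2} L^1$ formula can both be computed directly and compared. A minimal sketch with two illustrative distributions on $\{0, 1, 2\}$:

```python
from itertools import chain, combinations

# Two illustrative distributions on Omega = {0, 1, 2}.
mu = {0: 0.5, 1: 0.3, 2: 0.2}
nu = {0: 0.2, 1: 0.4, 2: 0.4}
Omega = list(mu)

# Definition: maximum of |mu(A) - nu(A)| over all subsets A of Omega.
subsets = chain.from_iterable(combinations(Omega, k)
                              for k in range(len(Omega) + 1))
tv_max = max(abs(sum(mu[x] - nu[x] for x in A)) for A in subsets)

# Lemma: half the L1 distance between the two probability vectors.
tv_l1 = 0.5 * sum(abs(mu[x] - nu[x]) for x in Omega)
# Both computations give 0.3 here.
```

The maximizing set is $A = \{x : \mu(x) > \nu(x)\}$ (here $A = \{0\}$), which is exactly why the subset definition collapses to half the $L^1$ distance.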

Definition A coupling of two probability measures $\mu$ and $\nu$ is a pair of random variables (X, Y) defined on the same probability space such that the marginal law of X is $\mu$ and the marginal law of Y is $\nu$.
Lemma $\|\mu - \nu\|_{TV} = \inf\{ \mathbb{P}[X \neq Y] : (X, Y)$ is a coupling of $\mu, \nu \}$.
Definition We call a coupling $(X, Y)$ optimal if $\mathbb{P}[X \neq Y] = \|\mu - \nu\|_{TV}$. On a finite $\Omega$ the infimum above is attained, so an optimal coupling always exists.
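The standard construction of an optimal coupling places mass $\min(\mu(x), \nu(x))$ on the diagonal (forcing $X = Y$ there) and pairs up the leftover mass of $\mu$ with the leftover mass of $\nu$ arbitrarily. A minimal sketch, reusing illustrative distributions, showing that the mass forced off the diagonal is exactly the total variation distance:

```python
# Two illustrative distributions on {0, 1, 2}.
mu = {0: 0.5, 1: 0.3, 2: 0.2}
nu = {0: 0.2, 1: 0.4, 2: 0.4}

# Put min(mu(x), nu(x)) mass on the diagonal event {X = Y = x}.
overlap = {x: min(mu[x], nu[x]) for x in mu}
mu_left = {x: mu[x] - overlap[x] for x in mu}  # where mu exceeds nu
nu_left = {x: nu[x] - overlap[x] for x in nu}  # where nu exceeds mu

# However the leftovers are paired, they land off the diagonal, so
# P[X != Y] equals the leftover mass -- which is the TV distance.
prob_differ = sum(mu_left.values())
tv = 0.5 * sum(abs(mu[x] - nu[x]) for x in mu)
# Both equal 0.3 here.
```

Note that `sum(mu_left.values())` and `sum(nu_left.values())` agree, since both distributions have total mass 1; this is what makes the leftover pairing possible.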
