Introduction to Markov chain mixing
Recall that if $(X_n)_n$ is an irreducible Markov chain with stationary distribution $\pi$, then
$$\lim_{n \to \infty} \frac{1}{n} \sum_{j=0}^{n} 1_{[X_j = x]} = \pi(x), \quad \mathbb{P}_\mu\text{-a.s.}$$

Today's goal: we will show that $X_n$ converges to $\pi$ in a suitably strong sense. We cover:
- Total variation distance
- The convergence theorem
- Mixing times

Total variation distance. Let $\mu$ and $\nu$ be probability measures on $\Omega$. Define
$$\|\mu - \nu\|_{TV} = \max_{A \subseteq \Omega} |\mu(A) - \nu(A)|.$$

Lemma. There are three equivalent ways to characterize the total variation distance:
1. $\|\mu - \nu\|_{TV} = \frac{1}{2} \sum_{x \in \Omega} |\mu(x) - \nu(x)|$.
2. $\|\mu - \nu\|_{TV} = \frac{1}{2} \sup\{\mu f - \nu f : f \text{ satisfying } \max_{x \in \Omega} |f(x)| \leq 1\}$, where $\mu f = \sum_{x \in \Omega} f(x)\,\mu(x)$.
3. $\|\mu - \nu\|_{TV} = \inf\{\mathbb{P}[X \neq Y] : (X, Y) \text{ is a coupling of } \mu, \nu\}$.

Definition. We call $(X, Y)$ the ...
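The equivalent characterizations above can be checked numerically on a small state space. The sketch below, using toy distributions $\mu, \nu$ of my own choosing on $\Omega = \{0, 1, 2\}$, computes the total variation distance three ways: the maximum over all events $A$, half the $\ell^1$ distance, and the disagreement probability of the optimal coupling (which places mass $\min(\mu(x), \nu(x))$ on the diagonal).

```python
import itertools
import numpy as np

# Toy probability distributions on Omega = {0, 1, 2} (illustrative values).
mu = np.array([0.5, 0.3, 0.2])
nu = np.array([0.2, 0.4, 0.4])

# 1) Definition: maximum of |mu(A) - nu(A)| over all subsets A of Omega.
states = range(len(mu))
tv_max = max(
    abs(mu[list(A)].sum() - nu[list(A)].sum())
    for r in range(len(mu) + 1)
    for A in itertools.combinations(states, r)
)

# 2) Lemma, part 1: half the l1 distance between the densities.
tv_l1 = 0.5 * np.abs(mu - nu).sum()

# 3) Lemma, part 3: the optimal coupling puts mass min(mu(x), nu(x)) on
#    the diagonal, so its disagreement probability P[X != Y] equals
#    1 - sum_x min(mu(x), nu(x)).
tv_coupling = 1.0 - np.minimum(mu, nu).sum()

print(tv_max, tv_l1, tv_coupling)  # all three agree: 0.3
```

For these particular $\mu$ and $\nu$ all three expressions evaluate to $0.3$; the brute-force maximum over subsets is only feasible because $\Omega$ is tiny.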