Posts

Introduction to Stochastic Processes

Domain. $\Omega$: finite state space. $P$: transition matrix of size $|\Omega| \times |\Omega|$; we will see it multiple times (its entry $P(x, y)$ is the probability of moving from state $x$ to state $y$).

Definition. A sequence of random variables $(X_0, X_1, X_2, \ldots)$ is a Markov chain with state space $\Omega$ and transition matrix $P$ if for all $n \geq 0$ and all sequences $(x_0, x_1, \ldots, x_n, x_{n+1})$, we have that

$\mathbb{P}[X_{n+1} = x_{n+1} \mid X_0 = x_0, \ldots, X_n = x_n] = \mathbb{P}[X_{n+1} = x_{n+1} \mid X_n = x_n] = P(x_n, x_{n+1})$.

Gambler's ruin. The gambler's situation can be modeled by a Markov chain on the state space $\{0, 1, \ldots, N\}$:
$X_0$: initial money in the purse.
$X_n$: the gambler's fortune at time $n$.
$\mathbb{P}[X_{n+1} = X_n + 1 \mid X_n] = 1/2$, $\mathbb{P}[X_{n+1} = X_n - 1 \mid X_n] = 1/2$.
The states $0$ and $N$ are absorbing. $\tau$: the time at which the gambler stops.

Theorem. Assume that $X_0 = k$ for some $0 \leq k \leq N$. Then ...
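The gambler's ruin chain above is easy to simulate directly. A minimal Python sketch (the helper `gamblers_ruin` and its parameters are my own, not from the notes):

```python
import random

def gamblers_ruin(k, N, seed=0):
    """Simulate the gambler's ruin chain started at X_0 = k.

    At each step the fortune moves up or down by 1, each with
    probability 1/2; states 0 and N are absorbing.
    Returns (final state, stopping time tau)."""
    rng = random.Random(seed)
    x, tau = k, 0
    while 0 < x < N:  # stop as soon as the chain is absorbed
        x += 1 if rng.random() < 0.5 else -1
        tau += 1
    return x, tau

# The chain always ends absorbed at 0 (ruin) or N (target fortune).
state, tau = gamblers_ruin(k=3, N=10)
```

Running many such trajectories from different $k$ gives an empirical check on the theorem's absorption probabilities.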

Introduction to Markov chain mixing

Recall: if $(X_n)_n$ is an irreducible Markov chain with stationary distribution $\pi$, then

$\lim_{n \to \infty} \frac{1}{n} \sum_{j=0}^n 1_{[X_j = x]} = \pi(x)$, $\quad \mathbb{P}_\mu$-a.s.

Today's goal. We will show that $X_n$ converges to $\pi$ in a suitably "strong sense". Topics: total variation distance, the convergence theorem, mixing times.

There are three ways to characterize the total variation distance. Let $\mu$ and $\nu$ be probability measures on $\Omega$:

$\| \mu - \nu \|_{TV} = \max_{A \subset \Omega} | \mu(A) - \nu(A) |$.

Lemma.
$\| \mu - \nu \|_{TV} = \frac{1}{2} \sum_{x \in \Omega} | \mu(x) - \nu(x) |$.
$\| \mu - \nu \|_{TV} = \frac{1}{2} \sup \{ \mu f - \nu f : f \text{ satisfying } \max_{x \in \Omega} |f(x)| \leq 1 \}$.
$\| \mu - \nu \|_{TV} = \inf \{ \mathbb{P}[X \neq Y] : (X, Y) \text{ is a coupling of } \mu, \nu \}$.

Definition. We call $(X, Y)$ the ...
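On a small finite $\Omega$ the first two characterizations can be checked numerically against each other. A brute-force Python sketch (the measures `mu`, `nu` and both helper names are my own illustration):

```python
from itertools import chain, combinations

def tv_max_over_sets(mu, nu):
    """max over all subsets A of Omega of |mu(A) - nu(A)| (brute force)."""
    omega = list(mu)
    subsets = chain.from_iterable(
        combinations(omega, r) for r in range(len(omega) + 1))
    return max(abs(sum(mu[x] for x in A) - sum(nu[x] for x in A))
               for A in subsets)

def tv_half_l1(mu, nu):
    """(1/2) * sum_x |mu(x) - nu(x)| -- the lemma's first formula."""
    return 0.5 * sum(abs(mu[x] - nu[x]) for x in mu)

mu = {"a": 0.5, "b": 0.3, "c": 0.2}
nu = {"a": 0.2, "b": 0.4, "c": 0.4}
# both formulas give the same value; the maximizing A is {x : mu(x) > nu(x)}
```

The set achieving the maximum is exactly $A = \{x : \mu(x) > \nu(x)\}$, which is how the lemma's proof goes.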

Markov chains: time-reversal

Ergodic Theorem. Let $f$ be a real-valued function defined on $\Omega$. If $(X_n)_n$ is an irreducible Markov chain with stationary distribution $\pi$, then for any starting distribution $\mu$, we have

$\lim_{n \to \infty} \frac{1}{n} \sum_{j=0}^n f(X_j) = \pi f$, $\quad \mathbb{P}_\mu$-almost surely (a.s.).

In particular, $\lim_{n \to \infty} \frac{1}{n} \sum_{j=0}^n 1_{[X_j = x]} = \pi(x)$, $\quad \mathbb{P}_\mu$-a.s.

Detailed balance equations. Definition. Suppose that a probability measure $\pi$ on $\Omega$ satisfies

$\pi(x) P(x, y) = \pi(y) P(y, x), \quad \forall x, y \in \Omega$.

These are called the detailed balance equations.

Lemma. Any distribution $\pi$ satisfying the detailed balance equations is stationary for $P$.

Definition. A chain satisfying the detailed balance equations is called reversible. Simple random walk on grap...
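For simple random walk on a graph, $\pi(x) = \deg(x) / 2|E|$ satisfies detailed balance, and the lemma then gives stationarity; both facts can be verified numerically. A Python sketch on a small graph of my own choosing:

```python
# Simple random walk on a 4-vertex graph (adjacency lists are an
# arbitrary example). P(x, y) = 1/deg(x) for each neighbor y of x.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
deg = {x: len(ys) for x, ys in adj.items()}
two_E = sum(deg.values())                      # = 2|E|
pi = {x: deg[x] / two_E for x in adj}          # candidate stationary dist.
P = {(x, y): (1 / deg[x] if y in adj[x] else 0.0)
     for x in adj for y in adj}

# Detailed balance: pi(x) P(x,y) == pi(y) P(y,x) for all x, y.
balanced = all(abs(pi[x] * P[x, y] - pi[y] * P[y, x]) < 1e-12
               for x in adj for y in adj)

# Stationarity: (pi P)(y) == pi(y) for all y, as the lemma asserts.
stationary = all(abs(sum(pi[x] * P[x, y] for x in adj) - pi[y]) < 1e-12
                 for y in adj)
```

Detailed balance is stronger than stationarity: it matches probability flow across each individual edge, not just in aggregate.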

Markov chains: stationary distribution

Irreducible. Definition. A transition matrix $P$ is called irreducible if for any $x, y \in \Omega$, there exists a number $n$ (possibly depending on $x, y$) such that

$P^n(x, y) > 0$.

For any $x \in \Omega$, define $T(x) = \{n \geq 1 : P^n(x, x) > 0\}$. The period of state $x$ is the greatest common divisor of $T(x)$, denoted by $\gcd(T(x))$.

Lemma. If $P$ is irreducible, then $\gcd(T(x)) = \gcd(T(y))$ for all $x, y \in \Omega$.

Aperiodic. Definition. For an irreducible chain, the period of the chain is defined to be the period common to all states. The chain is aperiodic if all states have period 1.

Example: Consider a simple random walk on the $N$-cycle where $N$ is odd. Then the walk is irreducible and aperiodic.

Theorem. If $P$ is irreducible and aperiodic, then there exists an integer $r$ such that

$P^n(x, y) > 0, \forall x, y \...
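The period $\gcd(T(x))$ can be computed by taking matrix powers and collecting the return times $n$ with $P^n(x, x) > 0$. A Python sketch (the helper `period` and the truncation bound `max_n` are my own; in general only finitely many return times are needed, since the gcd stabilizes):

```python
from math import gcd

def period(P, x, max_n=50):
    """gcd of {n >= 1 : P^n(x, x) > 0}, with n truncated at max_n.

    P is a row-stochastic matrix given as a list of lists."""
    m = len(P)
    Pn = P  # current power, P^n with n = 1
    g = 0   # gcd(0, k) == k, so g accumulates the gcd of return times
    for n in range(1, max_n + 1):
        if Pn[x][x] > 0:
            g = gcd(g, n)
        # advance to the next power: Pn <- Pn @ P
        Pn = [[sum(Pn[i][k] * P[k][j] for k in range(m))
               for j in range(m)] for i in range(m)]
    return g

# Simple random walk on the 3-cycle (N odd): aperiodic, period 1.
P3 = [[0, 0.5, 0.5], [0.5, 0, 0.5], [0.5, 0.5, 0]]
# Simple random walk on the 4-cycle (N even): period 2.
P4 = [[0, 0.5, 0, 0.5], [0.5, 0, 0.5, 0],
      [0, 0.5, 0, 0.5], [0.5, 0, 0.5, 0]]
```

This matches the example: on an odd cycle the walk returns in both 2 steps and $N$ steps, and $\gcd(2, N) = 1$ for odd $N$.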