
I have an example in my textbook of an "embedded Markov chain", where I don't understand one step.

Suppose that $(X_n)_{n\geq 0}$ is Markov$(\lambda, P)$, where $\lambda$ is the initial distribution and $P$ is the transition matrix. Let $J\subseteq I$. Construct a random process that is observed only when it hits the set $J$: let $Y_m=X_{T_m}$, where $T_0=\inf\{n\geq 0 : X_n \in J\}$ and $T_m=\inf\{n\gt T_{m-1} : X_n \in J\}$. Assume $\Bbb P(T_m \lt \infty)=1$ for every $m\in \Bbb N$. Claim: $(Y_m)_{m\geq 0}$ is a Markov chain.
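To make the construction concrete, here is a small simulation sketch. The 4-state chain, its transition matrix, and the observed set $J=\{0,2\}$ are all made up for illustration:

```python
import random

# A made-up 4-state chain on I = {0, 1, 2, 3}; states 0 and 2 form the
# observed set J. Each row of P sums to 1.
P = [
    [0.1, 0.4, 0.3, 0.2],
    [0.3, 0.1, 0.4, 0.2],
    [0.2, 0.3, 0.1, 0.4],
    [0.4, 0.2, 0.3, 0.1],
]
J = {0, 2}

def step(state, rng):
    """One transition of (X_n), sampled from row `state` of P."""
    return rng.choices(range(len(P)), weights=P[state])[0]

def embedded_path(x0, n_steps, rng):
    """Run (X_n) for n_steps and record it only when it sits in J.

    The recorded states are X_{T_0}, X_{T_1}, ... = Y_0, Y_1, ...,
    i.e. the embedded chain (Y_m)."""
    x, ys = x0, []
    for _ in range(n_steps):
        if x in J:
            ys.append(x)
        x = step(x, rng)
    return ys

rng = random.Random(0)
print(embedded_path(1, 20, rng))  # a sample path of the embedded chain
```

Every entry of the printed list lies in $J$; the times between entries (the gaps $T_{m+1}-T_m$) are random and are simply discarded.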

Proof:

Firstly, $T_m$ is a stopping time for every $m\geq 0$.

Next, $\Bbb P(Y_{m+1}=i_{m+1} | Y_0=i_0,...,Y_m=i_m)$

$=\Bbb P(X_{T_{m+1}}=i_{m+1} | X_{T_0}=i_0,..., X_{T_m}=i_m)$

Here comes the part where I don't understand:

$=\Bbb P_{i_m}(X_{T_1}=i_{m+1})=\overline P_{i_m i_{m+1}}$ (by the strong Markov property). (How does $X_{T_{m+1}}$ turn into $X_{T_1}$?)

Here $\overline P_{ij}=\Bbb P_i(\text{next visit to } J \text{ is to state } j)$, which is the smallest solution to the system of linear equations $$\overline P_{ij}= P_{ij}+ \sum_{k \notin J}P_{ik}\overline P_{kj}.$$
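For a finite chain this system can be solved directly: fix $j\in J$ and treat the $\overline P_{kj}$ for $k\notin J$ as the unknowns. A NumPy sketch, using a made-up 4-state chain with $J=\{0,2\}$:

```python
import numpy as np

# Made-up 4-state chain for illustration; J = {0, 2} is the observed set.
P = np.array([
    [0.1, 0.4, 0.3, 0.2],
    [0.3, 0.1, 0.4, 0.2],
    [0.2, 0.3, 0.1, 0.4],
    [0.4, 0.2, 0.3, 0.1],
])
J = [0, 2]
C = [1, 3]  # complement of J in I

def pbar_column(j):
    """Solve h_i = P_{ij} + sum_{k not in J} P_{ik} h_k for h_i = Pbar_{ij}.

    The unknowns are h_k for k not in J, so they satisfy the linear system
    (I - P[C,C]) h_C = P[C, j]; the rows i in J then follow by substitution."""
    hC = np.linalg.solve(np.eye(len(C)) - P[np.ix_(C, C)], P[C, j])
    h = np.zeros(len(P))
    h[C] = hC
    h[J] = P[J, j] + P[np.ix_(J, C)] @ hC
    return h

# Rows indexed by I, columns by J; restricting to rows in J gives the
# transition matrix of the embedded chain (Y_m).
Pbar = np.column_stack([pbar_column(j) for j in J])
print(Pbar)
print(Pbar.sum(axis=1))  # each row sums to 1, since P(T_m < infinity) = 1
```

The row sums being 1 reflects exactly the assumption $\Bbb P(T_m \lt \infty)=1$: started anywhere, the chain visits $J$ again with probability one.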

1 Answer


The key realization is that because the transition matrix is constant in time (the chain is time-homogeneous), the probability of moving from a given state $i$ to a given state $j$ does not depend on the time index.

Observe that:

$$\mathbb{P}(X_{T_{m+1}}=i_{m+1}| X_{T_0}=i_0,..., X_{T_m}=i_m) =\mathbb{P}(X_{T_{m+1}}=i_{m+1}|X_{T_m}=i_m)$$

by the strong Markov property. Since the transition matrix is constant, the absolute time index is also irrelevant; all that matters is the relative relationship between $T_{m+1}$ and $T_m$: $T_{m+1}$ is the first visit to $J$ after $T_m$, exactly as $T_1$ is the first visit to $J$ after $T_0$. So

$$\mathbb{P}(X_{T_{m+1}}=i_{m+1}|X_{T_m}=i_m) = \mathbb{P}(X_{T_{1}}=i_{m+1}|X_{T_{0}}=i_m)$$

and a change of notation from $\mathbb{P}(X_{T_{1}}=i_{m+1}|X_{T_{0}}=i_m)$ to $\mathbb{P}_{i_m}(X_{T_{1}}=i_{m+1})$, changing the conditional term $|X_{T_{0}}=i_m$ to a subscript $i_m$ on $\mathbb{P}$, gets you the rest of the way there.
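This time-homogeneity can also be checked empirically: estimating $\Bbb P(Y_{m+1}=j \mid Y_m=i)$ by simulation gives, up to Monte Carlo noise, the same number for different values of $m$. A sketch with a made-up 4-state chain and $J=\{0,2\}$:

```python
import random

# Made-up 4-state chain for illustration; J = {0, 2} is the observed set.
P = [
    [0.1, 0.4, 0.3, 0.2],
    [0.3, 0.1, 0.4, 0.2],
    [0.2, 0.3, 0.1, 0.4],
    [0.4, 0.2, 0.3, 0.1],
]
J = {0, 2}

def next_visit(x, rng):
    """From state x, run the chain until it next lands in J; return that state.
    This samples one step of the embedded chain, i.e. from Pbar_{x,.}."""
    while True:
        x = rng.choices(range(4), weights=P[x])[0]
        if x in J:
            return x

def estimate(m, trials, rng):
    """Monte Carlo estimate of P(Y_{m+1} = 0 | Y_m = 0)."""
    hits = total = 0
    for _ in range(trials):
        y = 1                        # start X_0 = 1 (not in J)
        path = []
        while len(path) <= m + 1:    # generate Y_0, ..., Y_{m+1}
            y = next_visit(y, rng)
            path.append(y)
        if path[m] == 0:
            total += 1
            hits += path[m + 1] == 0
    return hits / total

rng = random.Random(0)
# Same conditional probability at two different time indices m:
# the two estimates agree up to sampling noise.
print(estimate(0, 20000, rng), estimate(5, 20000, rng))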

jbowman