My textbook has an example of an "embedded Markov chain" in which I don't understand one step.
Suppose that $(X_n)_{n\geq 0}$ is Markov$(\lambda, P)$, where $\lambda$ is the initial distribution and $P$ is the transition matrix. Let $J\subseteq I$ and construct the process observed only at its visits to $J$: let $Y_m=X_{T_m}$, where $T_0=\inf\{n\geq0 : X_n \in J\}$ and $T_m=\inf\{n\gt T_{m-1} : X_n \in J\}$. Assume $\Bbb P(T_m \lt \infty)=1$ for every $m\in \Bbb N$. Claim: $(Y_m)_{m\geq 0}$ is a Markov chain.
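To make sure I understand the construction, here is how I would simulate it. The 4-state chain $P$ and the set $J=\{0,2\}$ below are a made-up example of mine, not the textbook's:

```python
import random

random.seed(0)

# Toy 4-state chain on I = {0, 1, 2, 3}; this P and J are my own example,
# not the textbook's. Each row of P sums to 1.
P = [
    [0.1, 0.4, 0.3, 0.2],
    [0.3, 0.1, 0.4, 0.2],
    [0.2, 0.3, 0.1, 0.4],
    [0.4, 0.2, 0.3, 0.1],
]
J = {0, 2}

def step(i):
    """One transition of X from state i, sampled from row P[i]."""
    return random.choices(range(len(P)), weights=P[i])[0]

def embedded_path(x0, m):
    """Return (Y_0, ..., Y_m): X observed only at its visits to J."""
    x, ys = x0, []
    while len(ys) <= m:
        if x in J:          # current time is some T_k, so record Y_k = X_{T_k}
            ys.append(x)
        x = step(x)         # advance the underlying chain X
    return ys

ys = embedded_path(1, 10)   # Y_0, ..., Y_10, started from X_0 = 1
```

Every $Y_m$ lands in $J$ by construction, which matches the claim that $(Y_m)_{m\geq 0}$ is a chain on the state space $J$.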
Proof:
First, each $T_m$, $m\geq 0$, is a stopping time.
Next, $\Bbb P(Y_{m+1}=i_{m+1} | Y_0=i_0,...,Y_m=i_m)$
$=\Bbb P(X_{T_{m+1}}=i_{m+1} | X_{T_0}=i_0,..., X_{T_m}=i_m)$
Here comes the part where I don't understand:
(by the strong Markov property) $= \Bbb P_{i_m}(X_{T_1}=i_{m+1}) = \overline P_{i_m i_{m+1}}$. (How does $X_{T_{m+1}}$ turn into $X_{T_1}$?)
Here $\overline P_{ij}=\Bbb P_i(\text{next visit to } J \text{ is state } j)$, which is the smallest solution to the system of linear equations $$\overline P_{ij}= P_{ij}+ \sum_{k \notin J}P_{ik}\overline P_{kj}.$$
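For concreteness, here is a sketch of solving that linear system numerically (again with a made-up 4-state chain, not the textbook's). Since $\Bbb P(T_m<\infty)=1$ here, the system restricted to the states outside $J$ has a unique solution, which is then the smallest one:

```python
import numpy as np

# Toy chain: my own example, not the textbook's.
P = np.array([
    [0.1, 0.4, 0.3, 0.2],
    [0.3, 0.1, 0.4, 0.2],
    [0.2, 0.3, 0.1, 0.4],
    [0.4, 0.2, 0.3, 0.1],
])
J = [0, 2]
S = [k for k in range(len(P)) if k not in J]   # states outside J

def pbar_column(j):
    """Column (Pbar_{ij})_i solving Pbar_{ij} = P_{ij} + sum_{k not in J} P_{ik} Pbar_{kj}."""
    # For i outside J the unknowns close among themselves:
    # (I - P_SS) x = P_{Sj}; then substitute back into the equation at every i.
    A = np.eye(len(S)) - P[np.ix_(S, S)]
    x = np.linalg.solve(A, P[S, j])            # Pbar_{kj} for k in S
    return P[:, j] + P[:, S] @ x               # the defining equation at each i

Pbar = np.column_stack([pbar_column(j) for j in J])
# From any i the next visit to J is some j in J, so each row sums to 1.
assert np.allclose(Pbar.sum(axis=1), 1.0)
```

The rows of `Pbar` restricted to $i\in J$ give the transition matrix $\overline P$ of the embedded chain $(Y_m)_{m\geq 0}$.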