Questions tagged [transition-matrix]

A transition matrix is a square matrix used to describe the transitions of a Markov chain.

The term "transition matrix" is used in a number of different contexts in mathematics.

In linear algebra, it is sometimes used to mean a change of coordinates matrix.

In the theory of Markov chains, it is used as an alternate name for a stochastic matrix, i.e., a matrix that describes transitions.

In control theory, a state-transition matrix is a matrix whose product with the initial state vector gives the state vector at a later time.
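As a minimal illustration of the Markov-chain usage, the defining properties can be checked in a few lines; the 3-state matrix below is an arbitrary example, not taken from any question on this page:

```python
import numpy as np

# A hypothetical 3-state chain: row i gives P(next = j | current = i).
P = np.array([
    [0.9, 0.1, 0.0],
    [0.2, 0.6, 0.2],
    [0.0, 0.3, 0.7],
])

# Every row of a (row-)stochastic matrix sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Multiplying a distribution over states by P gives the distribution one step later.
pi0 = np.array([1.0, 0.0, 0.0])   # start in state 0 with certainty
pi1 = pi0 @ P                     # one step later: equals row 0 of P
print(pi1)
```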

94 questions
10
votes
0 answers

Reinforcement *Model* Learning

Classical reinforcement learning (Q- or Sarsa-Learning) can be extended with models of the environment. These models are usually transition tables that contain the probability of arriving at a particular state given another state and one action. In…
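A count-based model of the kind this excerpt describes can be sketched as follows; the states, actions, and experience tuples here are entirely made up:

```python
from collections import defaultdict

# Count-based environment model: estimate P(s' | s, a) from observed
# (state, action, next_state) triples. All data here is hypothetical.
counts = defaultdict(lambda: defaultdict(int))
experience = [(0, 'right', 1), (0, 'right', 1), (0, 'right', 0),
              (1, 'left', 0), (1, 'left', 0)]
for s, a, s2 in experience:
    counts[(s, a)][s2] += 1

def p(s, a, s2):
    """Maximum-likelihood estimate of P(s2 | s, a)."""
    total = sum(counts[(s, a)].values())
    return counts[(s, a)][s2] / total if total else 0.0

print(p(0, 'right', 1))   # 2 of 3 observed 'right' moves from state 0 landed in 1
```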
9
votes
1 answer

Given two absorbing Markov chains, what is the probability that one will terminate before the other?

I have two different Markov chains, each with one absorbing state and a known starting position. I want to determine the probability that chain 1 will reach an absorbing state in fewer steps than chain 2. I think I can calculate the probability of…
Jeff
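Absent a closed form, the probability that one chain absorbs first can be estimated by simulation. A sketch using two hypothetical two-state chains (the questioner's actual chains are not shown in the excerpt):

```python
import random

def steps_to_absorb(P, start, absorbing, rng, cap=10_000):
    """Simulate one trajectory; return the number of steps until absorption."""
    s, n = start, 0
    while s not in absorbing and n < cap:
        s = rng.choices(range(len(P)), weights=P[s])[0]
        n += 1
    return n

# Two made-up chains, each with state 1 absorbing.
P1 = [[0.5, 0.5], [0.0, 1.0]]   # leaves state 0 w.p. 1/2 per step
P2 = [[0.9, 0.1], [0.0, 1.0]]   # leaves state 0 w.p. 1/10 per step
rng = random.Random(42)
wins = sum(steps_to_absorb(P1, 0, {1}, rng) < steps_to_absorb(P2, 0, {1}, rng)
           for _ in range(10_000))
print(wins / 10_000)   # Monte Carlo estimate of P(chain 1 absorbs strictly first)
```

For these two particular chains the absorption times are geometric, so the exact value $\sum_{k\ge 1} (1/2)^k (9/10)^k = 0.45/0.55 \approx 0.818$ is available as a sanity check on the estimate.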
6
votes
1 answer

The expected long-run proportion of time the chain spends at $a$, given that it starts at $c$

Consider the transition matrix: $\begin{bmatrix} \frac{1}{5} & \frac{4}{5} & 0 & 0 & 0 \\ \frac{1}{2} & \frac{1}{2} & 0 & 0 & 0 \\ \frac{1}{5} & \frac{1}{5} & \frac{1}{5} & \frac{1}{5} & \frac{1}{5} \\ 0 & \frac{1}{3} & \frac{1}{3} & \frac{1}{3} & 0…
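The standard route to such long-run proportions is the stationary distribution: solve $\pi P = \pi$ with $\sum_i \pi_i = 1$. A sketch with a hypothetical 3-state matrix (the 5-state matrix in the question is truncated above):

```python
import numpy as np

# Hypothetical irreducible chain standing in for the truncated matrix.
P = np.array([
    [0.2, 0.8, 0.0],
    [0.5, 0.3, 0.2],
    [0.1, 0.4, 0.5],
])

# Solve pi P = pi together with sum(pi) = 1 as one least-squares system.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# By the ergodic theorem, pi[a] is the long-run fraction of time in state a,
# whatever the starting state c: every row of P^k converges to pi.
print(np.linalg.matrix_power(P, 50)[2])   # row for start state c = 2, ≈ pi
```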
5
votes
0 answers

Find the invariant measure $\pi=(\pi_{1},\pi_{2},\pi_{3})$ for a Markov Chain with transition matrix given

Let $(X_{n})_{n\in\mathbb{N}_{0}}$ be a Markov Chain with state space $M=\left\{x_{1},x_{2},x_{3}\right\}$ and transition matrix $$ \Pi=\left(\begin{array}{ccc}p_{1} & p_{2} & 1-p_{1}-p_{2}\\ q_{1} & q_{2} & 1-q_{1}-q_{2}\\ r_{1} & r_{2} &…
5
votes
1 answer

Significance testing for Markov chain transition probabilities

The question: How can I calculate p-values for individual transitions in a Markov chain? I want to test the null hypothesis that the probability of entering state $B$ from previous state $A$ is less than or equal to the overall probability of being…
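One common approach to this kind of test is an exact one-sided binomial test of the observed $A \to B$ transition count against the overall rate; the counts below are hypothetical:

```python
from math import comb

def binom_sf(k, n, p):
    """Exact upper tail P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical counts: state A was left 200 times, 60 of those transitions
# entered B; overall, 25% of all transitions in the data entered B.
p_value = binom_sf(60, 200, 0.25)
print(p_value)   # one-sided p-value for H1: P(B | A) > 0.25
```

Note this treats the overall rate 0.25 as fixed rather than estimated, which is only an approximation when it comes from the same data.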
4
votes
1 answer

How to show that the transition probability is equal to $\overline p_{ij} = \frac{P_{ij}}{\sum_{k\neq i}p_{ik}}$

(No new answers needed) I would like to award @whuber for his good answer with my bounty! Suppose that $(X_n)_{n≥0}$ is Markov$(λ, P)$ but that we only observe the process when it moves to a new state. Defining a new process $(Z_m)_{m≥0}$ as the…
user255658
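The formula in the title is the jump-chain construction: delete each self-loop probability $p_{ii}$ and renormalize the rest of the row. A quick numerical check with an arbitrary 3-state matrix:

```python
# Jump-chain construction: zero out each diagonal entry p_ii and renormalize
# the remaining row mass, giving pbar_ij = p_ij / sum_{k != i} p_ik.
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.25, 0.25, 0.5]]
n = len(P)
Pbar = [[0.0] * n for _ in range(n)]
for i in range(n):
    off_mass = sum(P[i][k] for k in range(n) if k != i)   # = 1 - p_ii
    for j in range(n):
        Pbar[i][j] = 0.0 if j == i else P[i][j] / off_mass
print(Pbar[0])   # [0.0, 0.6, 0.4]
```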
4
votes
1 answer

How to translate an adjacency matrix to a transition matrix for use in Markov cluster algorithm?

I have a 47×47 matrix containing only 0's and 1's. I want to apply the Markov clustering algorithm to this matrix, but this method needs a transition matrix, i.e. the columns must be normalized to sum to one. Could anyone help me to…
F.caren
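Column-normalizing a 0/1 adjacency matrix takes one broadcasted division in NumPy; the 3-node graph here is a made-up stand-in for the 47×47 matrix:

```python
import numpy as np

# 0/1 adjacency matrix for a hypothetical 3-node directed graph.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 1, 0]], dtype=float)

col_sums = A.sum(axis=0)
col_sums[col_sums == 0] = 1.0   # guard: leave all-zero columns (isolated nodes) alone
T = A / col_sums                # broadcasting divides each column by its sum
print(T.sum(axis=0))            # every column now sums to 1
```

Most MCL implementations also add self-loops first (replace A with A + I) before normalizing; that detail is worth checking against the specific implementation being used.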
4
votes
1 answer

Constructing a transition probability from Q-learning

In reinforcement learning, learning without the need for the transition probability matrix is 'model-free learning'. Instead of having the transition probabilities, we learn the q-values (state-action value functions), eventually obtaining the optimal…
cgo
4
votes
4 answers

Essential transient state in a Markov chain

Can a finite-state Markov chain have an essential transient state? I have found an example for an infinite state space, and my intuition (I may be wrong) is that for a finite state space this isn't possible... but I am not able to prove…
Qwerty
4
votes
2 answers

Transition probabilities - Markov chains

I have a homogeneous Markov chain with a given transition matrix. I want to compute $P(Y_1 = 1| Y_2=2)$ where $Y_t, t=1,2$ is the observation at time $t$ and $Y_0=3$. I tried Bayes' rule, so $$P(Y_1 = 1| Y_2=2)=…
Bux
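With the Markov property, the requested conditional probability follows from Bayes' rule, conditioning throughout on $Y_0 = 3$. The matrix below is a hypothetical stand-in for the one omitted from the excerpt:

```python
import numpy as np

# Hypothetical 3-state transition matrix; states 1..3 are stored 0-indexed.
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.4, 0.2],
              [0.3, 0.3, 0.4]])

# Bayes + Markov property, all terms conditioned on Y_0 = 3:
# P(Y1=1 | Y2=2) = P(Y2=2 | Y1=1) P(Y1=1 | Y0=3) / P(Y2=2 | Y0=3)
num = P[0, 1] * P[2, 0]   # P(2|1) * P(1|3)
den = (P @ P)[2, 1]       # two-step probability of 3 -> 2
print(num / den)          # ≈ 0.3846 for this matrix
```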
4
votes
1 answer

First-order Discrete Markov Chain with time lag

I want to estimate the first-order transition matrix of a sequence in discrete time, e.g. $$ s = 1,0,1,0,1,1,0,1,0,0, \dots$$ but states are not evenly spaced in time. So that even if $s_{t=1} = 1$ and $s_{t = 2} = 0$, and $s_{t=3} = 1$ and $s_{t =…
stochazesthai
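One point worth checking in this setting: if two consecutive observations are $k$ time steps apart, the transition between them is governed by $P^k$, not by $P$. A two-state illustration with an arbitrary matrix:

```python
import numpy as np

# Arbitrary 2-state chain.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Observations separated by two steps follow the two-step matrix P^2, so
# estimating a one-step matrix directly from lag-2 pairs would recover
# P @ P rather than P.
P2 = np.linalg.matrix_power(P, 2)
print(P2)
```

Recovering $P$ itself from unevenly lagged data amounts to a matrix-root problem, which need not have a unique (or even stochastic) solution.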
3
votes
1 answer

Finding steady-state probability of a Markov chain

Let $X_{n}$ be a Markov chain on state space $S = \{ 1,2 \dots, 23 \}$ with transition probability given by $p_{i,i+1}= p_{i,i-1} = \frac {1}{2} \ \ \forall \ 2\le i \le 22 , $ $ p_{1,2}= p_{1,23} = \frac {1}{2} $ $ p_{23,1}= p_{23,22} = \frac…
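The chain described is a symmetric random walk on a 23-cycle, so the uniform distribution is stationary; this can be verified directly:

```python
import numpy as np

# Symmetric random walk on the cycle {1, ..., 23}: each state steps to
# either neighbour with probability 1/2 (state 23 wraps around to 1).
n = 23
P = np.zeros((n, n))
for i in range(n):
    P[i, (i + 1) % n] = 0.5
    P[i, (i - 1) % n] = 0.5

pi = np.full(n, 1 / n)             # candidate: uniform distribution
print(np.allclose(pi @ P, pi))     # uniform is stationary
```

Because this P is doubly stochastic (its columns also sum to 1), uniformity of the stationary distribution is immediate.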
3
votes
2 answers

Markov chain with probability trends

I have clients with debts that can pass between the states: owe 1 bill, owe 2 bills, owe 3 bills, left the service, new debtor, and owe nothing. So I could calculate the probability of being in the state 1 bill and going back to owing nothing, and all the rest…
GabyLP
2
votes
3 answers

Probability of doing a specific Path in a Markov Chain

My problem is the following: I have this graph, representing a Markov chain: For example, if I am in state 1, the probability of going to state 2 or to state 4 is $\frac{1}{2}$. So I'm saying that the probability of going to any one particular state is uniform.…
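By the Markov property, the probability of a specific path is just the product of the one-step transition probabilities along it. A sketch with a hypothetical edge set loosely modelled on the description:

```python
# Hypothetical edge probabilities: from state 1, states 2 and 4 are each
# reached with probability 1/2, etc. (the actual graph is not shown).
P = {(1, 2): 0.5, (1, 4): 0.5,
     (2, 1): 0.5, (2, 3): 0.5,
     (4, 3): 1.0}

def path_prob(path):
    """Probability of following exactly this sequence of states."""
    prob = 1.0
    for s, t in zip(path, path[1:]):
        prob *= P.get((s, t), 0.0)   # missing edge => probability 0
    return prob

print(path_prob([1, 2, 3]))   # 0.5 * 0.5 = 0.25
```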
2
votes
1 answer

Number of states in HMM

I am testing an HMM by generating data from a 3x3 transition matrix and a 3x4 emission matrix and then trying to train an HMM against this data with different initializations. When I plot the log-likelihood of the observations given the…