
What is the expected number of coin tosses it takes to observe tails followed by 2 consecutive heads, given that the coin is fair?

I have calculated the expected number of coin tosses for 2 consecutive heads (= 6). Is there a way to use this piece of information to answer the question above? Or is there a way to tackle this question completely from scratch?

Alexis
jaclynx
  • I think that question is different enough to warrant a separate question. In the linked question the number of trials is fixed. Here, the number of tosses is random. – Xiaomi Oct 19 '18 at 04:19
  • In addition to the approach I posted below, there is a conditioning approach that uses a conditional r.v. such as "number of flips to THH given we have T" and "number of flips to THH given we have TH" which simplifies the problem greatly. If I find my notes, I'll try to post that too. – SecretAgentMan Oct 19 '18 at 04:20
  • The duplicate I have identified is an *exact* duplicate. The key word to search on is "Penney" (this is called "Penney's Game"). – whuber Oct 19 '18 at 11:58

1 Answer


There are many ways to approach this problem. Here is one using an absorbing Discrete Time Markov Chain (DTMC), which handles all the conditioning via the fundamental matrix. Since the coin is fair, assume coin flips are independent with $p=\Pr(\text{Heads}) = 0.5$. Let $N$ be the number of coin flips to reach the desired pattern.

Let $\{X_n,n=1,2,3,\ldots\}$ be a DTMC with one-step transition probability matrix $\mathbf{P}$ where $P_{ij}$ is the probability of transitioning to state $j$ from state $i$ (in one step).

The states are $X_n \in \{0, T, TH, THH\}$. The transition diagram is below.

[Transition diagram: from state 0, heads stays in 0 and tails moves to T; from T, tails stays in T and heads moves to TH; from TH, tails falls back to T and heads reaches the absorbing state THH.]

The transition matrix is then $$\mathbf{P}= \left[ {\begin{array}{c c c c} p & 1-p & 0 & 0 \\ 0 & 1-p & p & 0 \\ 0 & 1-p & 0 & p \\ 0 & 0 & 0 & 1 \end{array} } \right].$$

In canonical form, this can be organized into block matrices denoted by $$ \mathbf{P} = \left[ {\begin{array}{c c c c c c} \mathbf{Q} & \mathbf{R} \\ \mathbf{0} & \mathbf{I} \\ \end{array} }\right],$$

where $\mathbf{Q}$ is a $3\times 3$ square matrix, $\mathbf{R}$ is a $3\times 1$ column vector, $\mathbf{0}$ is a $1\times 3$ row vector, and $\mathbf{I}=[1]$ is a trivial $1\times 1$ identity matrix.

The fundamental matrix $$\mathbf{S}=\left(\mathbf{I}_{3\times 3}-\mathbf{Q}\right)^{-1}$$ has the interpretation that $S_{ij}$ is the expected number of times the process is in transient state $j$ given that it started in transient state $i$.

Summing across the $i$th row of $\mathbf{S}$ gives the expected total number of transitions (flips) until absorption (achieving the desired pattern) given you started in state $i$. Since we start with 0 coin flips (the first state), the desired quantity is then $$\text{E}[N] = \sum_{j=1}^{3} S_{1j}=8$$ or, written with the state notation, $$\text{E}[N] = \sum_{j\in\{0, T, TH\}} S_{0j}=8.$$
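As a cross-check, the conditioning approach mentioned in the comments gives the same answer without any matrix inversion. Let $t_s$ be the expected number of additional flips to reach THH from state $s$. First-step analysis (conditioning on the next flip) gives $$t_0 = 1 + p\,t_0 + (1-p)\,t_T, \qquad t_T = 1 + (1-p)\,t_T + p\,t_{TH}, \qquad t_{TH} = 1 + (1-p)\,t_T.$$ With $p=0.5$, solving this linear system yields $t_{TH}=4$, $t_T=6$, and $t_0 = \text{E}[N] = 8$.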

Example MATLAB code:

p = 0.5;                    % Pr(Heads)
P = [p 1-p 0 0;             % one-step transition matrix over {0, T, TH, THH}
    0 1-p p 0;
    0 1-p 0 p;
    0 0 0 1];
Q = P(1:3,1:3);             % transient-to-transient block
S = inv(eye(size(Q))-Q);    % fundamental matrix
avgN = sum(S(1,:))          % expected flips starting from state 0

avgN = 8
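For readers without MATLAB, here is the same computation translated to Python/NumPy (an illustrative port, not part of the original answer's code); the variable names mirror the MATLAB above:

```python
import numpy as np

# Absorbing-chain computation: fundamental matrix S = (I - Q)^{-1},
# then sum the first row to get the expected flips from state 0.
p = 0.5                            # Pr(Heads)
P = np.array([[p, 1 - p, 0, 0],    # one-step transition matrix over {0, T, TH, THH}
              [0, 1 - p, p, 0],
              [0, 1 - p, 0, p],
              [0, 0, 0, 1]])
Q = P[:3, :3]                      # transient-to-transient block
S = np.linalg.inv(np.eye(3) - Q)   # fundamental matrix
avgN = S[0].sum()                  # expected flips starting from state 0
print(avgN)                        # 8.0
```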

SecretAgentMan
    +1 Great answer overall. I am a little confused by your transition matrix. Shouldn't the a11 entry of the transition matrix, for example, be p, because the probability of staying in the "0" state is the probability of flipping Heads = p? Instead, you have it as 1-p. – ColorStatistics Jan 04 '21 at 14:58
  • @ColorStatistics Great catch, I'll fix the MathJax. I think the code is already correct. – SecretAgentMan Jan 05 '21 at 17:14
  • Looks great. Thank you for addressing it. – ColorStatistics Jan 05 '21 at 18:01