
We play a sequence of games, where the probability that you win each game is $p$. You play until you win $k$ consecutive games. What is the expected number of games you win? I can compute that the expected number of games played is $\frac{1-p^k}{p^k}\,\frac1{1-p}$.
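
A quick Monte Carlo sketch (Python; the function name is only for illustration) appears to confirm this:

```python
import random

def games_until_k_consecutive_wins(p, k):
    """Play until the first run of k consecutive wins; return games played."""
    games = streak = 0
    while streak < k:
        games += 1
        if random.random() < p:   # win this game with probability p
            streak += 1
        else:
            streak = 0            # a loss resets the streak
    return games

p, k, trials = 0.5, 3, 200_000
simulated = sum(games_until_k_consecutive_wins(p, k) for _ in range(trials)) / trials
exact = (1 - p**k) / (p**k * (1 - p))
print(simulated, exact)   # both should be close to 14 for p = 1/2, k = 3
```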

Adrien
Ethan
    Show your work, then. – StasK Apr 05 '18 at 15:12
  • It can be represented as a Markov chain, where the transition probability from $n$ to $n+1$ is $p$ and state $k$ is absorbing. – Ethan Apr 05 '18 at 15:49
  • I see -- the count of wins does not get accumulated that way. You can probably get fairly close by multiplying by $p$, but this may or may not be the right answer. I think you need to read up on rank nonparametric statistics -- what you are describing seems related to run tests (for independence in Bernoulli trials). – StasK Apr 05 '18 at 22:38
  • @StasK You are correct: it's not just fairly close, it's the right answer. – whuber Apr 06 '18 at 14:57

2 Answers


The answer is the expected number of games times the expected number of wins per game.

It would be reasonable to doubt this result: sure, it might be close; but couldn't it miss the mark slightly due to the fact that the game ends (by design) with a large streak of wins? Let's find a rigorous way to address this question.

Because the question is only about the number of wins, ignore all losses. The course of the game amounts to a series of runs of wins. It ends when a run of $k$ or more wins is observed.

This description is equivalent to the following more general situation. Let $f$ give the probabilities of any discrete distribution on the positive integers $\{1,2,3,\ldots\},$ let $X_i \sim_{iid} f,\ i=1, 2, 3, \ldots$ be a sequence of independent random variables with that distribution, define $\tau = \min\{i\mid X_i\ge k\}$ to be the stopping time for this process, and set

$$X = \sum_{i=1}^\tau \min(k, X_i) = X_1 + X_2 + \cdots + X_{\tau-1} + k.$$

We seek the value of $e = \mathbb{E}(X),$ the total of the $X_i$ until the process is stopped, counting only $k$ for the value of $X_\tau.$

There is a simple and obvious recursion: either $X_1 \ge k$ and the game stops or else we win $X_1$ games and everything starts all over again. Writing $\pi_i=f(i)$ for $i=1,2,\ldots, k-1$ and

$$\pi_k = f(k) + f(k+1) + f(k+2) + \cdots = 1 - (\pi_1 + \pi_2 + \cdots + \pi_{k-1})$$

for the chance of $k$ or more wins in a run: we stop the game and collect $k$ more wins with probability $\pi_k$. Otherwise we collect $i$ wins and restart the game from the beginning, with probability $\pi_i,\ i=1, 2, \ldots, k-1.$ Equating the expectations gives

$$e = k\pi_k + \sum_{i=1}^{k-1} \pi_i(i + e).$$
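
Collecting the terms in $e$ and using $\pi_1 + \pi_2 + \cdots + \pi_{k-1} = 1 - \pi_k$ turns this into

$$e\,\pi_k = k\pi_k + \sum_{i=1}^{k-1} i\pi_i = \sum_{i=1}^{k} i\pi_i.$$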

The unique solution of this linear equation for $e$ is

$$e = \frac{1}{\pi_k} \sum_{i=1}^k i \pi_i.$$

In the situation given by the problem, $\pi_i \propto p^i$ for $i\lt k$ and

$$\pi_k \propto p^k + p^{k+1} + p^{k+2} + \cdots = \frac{p^k}{1-p}.$$

The sum of all $k$ values is $p/(1-p),$ whence

$$\eqalign{ \pi_i &= (1-p)p^{i-1}, & i=1, 2, \ldots, k-1; \\ \pi_k &= p^{k-1}.& }$$
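
As a check, these probabilities sum to one:

$$\sum_{i=1}^{k-1}(1-p)p^{i-1} + p^{k-1} = \left(1 - p^{k-1}\right) + p^{k-1} = 1.$$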

This implies

$$e = p^{1-k} \frac{1-p^k}{1-p},$$

exactly $p$ times the expected number of games.
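
For instance, with $p=1/2$ and $k=2$ this gives $e = (1+p)/p = 3$ expected wins out of an expected $6$ games. A minimal Monte Carlo sketch (Python; the function name is only illustrative) agrees with the general formula:

```python
import random

def wins_until_k_consecutive(p, k):
    """Play until the first run of k consecutive wins; return total wins."""
    wins = streak = 0
    while streak < k:
        if random.random() < p:   # win with probability p
            wins += 1
            streak += 1
        else:
            streak = 0            # a loss resets the streak
    return wins

p, k, trials = 0.5, 3, 200_000
simulated = sum(wins_until_k_consecutive(p, k) for _ in range(trials)) / trials
exact = p**(1 - k) * (1 - p**k) / (1 - p)
print(simulated, exact)   # both should be close to 7 for p = 1/2, k = 3
```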

whuber

The fact that the expected number of wins is the win probability $p$ times the expected number of games is a simple application of the optional stopping theorem.

Let

  • $X_n$ be the total number of wins at round $n$. The recentered random variable $Y_n = X_n - p\,n$ is a martingale (a one-line check is given just after this list).

  • $\tau$ be the time till you win $k$ consecutive games. $\tau$ is obviously a stopping time.
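
Indeed, writing $W_{n+1}$ for the indicator that game $n+1$ is won and $\mathcal F_n$ for the history of the first $n$ games, independence gives

$$\mathbb E[Y_{n+1} \mid \mathcal F_n] = X_n + \mathbb E[W_{n+1}] - p(n+1) = X_n - p\,n = Y_n.$$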

You showed that $\mathbb E[\tau]$ is finite, and the increments of $Y_n$ are obviously bounded, so the conditions of case (b) in the Wikipedia article on the optional stopping theorem hold. The optional stopping theorem then tells us that $$\mathbb E[Y_\tau] = \mathbb E[Y_0] = 0$$ in our case, and thus $$\mathbb E[X_\tau - p\,\tau] = 0 \implies \mathbb E[X_\tau] = p\,\mathbb E[\tau].$$ Just plug in the expected value of the stopping time you computed, and you have the result.
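
Explicitly, plugging in $\mathbb E[\tau] = \frac{1-p^k}{p^k(1-p)}$ from the question gives

$$\mathbb E[X_\tau] = p\,\mathbb E[\tau] = \frac{1-p^k}{p^{k-1}(1-p)} = p^{1-k}\,\frac{1-p^k}{1-p},$$

matching the result in the other answer.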

Adrien