
Suppose we flip a biased coin (with probability $p$ of Heads) repeatedly until a certain pattern (e.g., HHHTT) appears. We are interested in the number of flips $N$ required. It is well-known that $\mathbb{E}[N]$ can be calculated using a martingale approach; however, if one would like to simulate the full probability distribution of $N$ (not just the first moment), is there a more efficient approach (e.g., importance sampling or MCMC) than simply performing repeated Monte Carlo simulations (such as that performed here)?
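For concreteness, the martingale (gambling-team) calculation mentioned above is especially simple for the example pattern HHHTT, since no proper prefix of HHHTT is also a suffix of it; the expectation then reduces to a single term:

$$\mathbb{E}[N] \;=\; \frac{1}{p^3 (1-p)^2}, \qquad \text{e.g. } \mathbb{E}[N] = 32 \text{ for } p = \tfrac{1}{2}.$$

(In general the sum runs over every prefix of the pattern that is also a suffix, with one term $1/\mathbb{P}(\text{prefix})$ per such overlap.)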

Notice that if the coin is heavily biased or the pattern is long, it may take a very large number of flips to obtain even a single sample.
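To make the baseline explicit, here is a minimal sketch of the naive Monte Carlo approach referred to above (the function names `flips_until_pattern` and `sample_distribution` are mine, not from any linked code):

```python
import random

def flips_until_pattern(pattern, p, rng):
    """Flip a coin with P(H) = p until `pattern` appears; return the flip count."""
    window = ""
    n = 0
    while not window.endswith(pattern):
        # Keep only the last len(pattern) flips; that is all we need to match.
        window = (window + ("H" if rng.random() < p else "T"))[-len(pattern):]
        n += 1
    return n

def sample_distribution(pattern, p, num_samples, seed=0):
    """Draw i.i.d. samples of N by brute-force simulation."""
    rng = random.Random(seed)
    return [flips_until_pattern(pattern, p, rng) for _ in range(num_samples)]
```

For HHHTT with $p = 1/2$ the sample mean should hover near the exact value $\mathbb{E}[N] = 32$, but each sample costs $O(\mathbb{E}[N])$ flips, which is exactly the inefficiency the question is about.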

Also, is there a simpler approach for $p = 1/2$?

p-value
    @p-value How do you define efficiency? The most natural way to define efficiency of Importance Sampling or MCMC requires a test function, i.e. we want an (unbiased) estimator $\hat I$ of $I=E[h(N)]$ for some function $h$, and we look to minimize $Var(\hat I)$. – Robin Ryder Feb 25 '19 at 20:25
  • @RobinRyder I agree with what you said. In terms of efficiency, I'm interested in both statistical efficiency (which relates to the variance of test functions, as you mentioned) and computational efficiency (better exploring the distribution using fewer Monte Carlo samples than naive MC). I agree this is still not exactly well-defined, but I'm interested in any approach that substantially improves upon simple MC in any way. – p-value Feb 28 '19 at 18:42

0 Answers