Suppose we flip a biased coin (Heads with probability $p$) repeatedly until a certain pattern (e.g., HHHTT) appears. We are interested in the number of flips $N$ required. It is well known that $\mathbb{E}[N]$ can be calculated using a martingale approach; however, if one would like to simulate the full probability distribution of $N$ (not just the first moment), is there a more efficient approach (e.g., importance sampling or MCMC) than simply performing repeated Monte Carlo simulations (such as the one performed here)?
Notice that if the coin is heavily biased or the pattern is long, it may take a very large number of trials to get even a single sample.
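For concreteness, here is a minimal sketch of the plain Monte Carlo baseline I have in mind (the linked code is not reproduced here, so the function name and parameters below are just illustrative):

```python
import random

def sample_waiting_time(pattern="HHHTT", p=0.5, rng=random):
    """Flip a p-biased coin until `pattern` appears; return the number of flips."""
    window = ""
    n = 0
    while True:
        n += 1
        flip = "H" if rng.random() < p else "T"
        window = (window + flip)[-len(pattern):]  # keep only the last len(pattern) flips
        if window == pattern:
            return n

# Naive Monte Carlo: draw many independent samples of N to estimate its distribution.
samples = [sample_waiting_time("HHHTT", p=0.5) for _ in range(100_000)]
print(sum(samples) / len(samples))  # HHHTT has no self-overlap, so E[N] = 1/(p^3 (1-p)^2) = 32 at p = 1/2
```

Each sample requires simulating an entire run of flips, which is exactly what becomes expensive in the regimes described above.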
Also, is there a simpler approach for $p = 1/2$?