I managed to reduce a problem in my research to the following. Suppose we have a population of cells that starts from a single cell with zero mutations, and at each time step every cell divides in two. After each division, one of the two daughter cells can gain a mutation with probability $\mu$ or lose a mutation with probability $\beta$, where $\beta < \mu$ (a cell with zero mutations cannot lose any). After a long time, what is the fraction $f_i$ of cells that have $i$ mutations?
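For reference, this is how I would simulate the process directly; the rates $\mu = 0.1$, $\beta = 0.02$ and the number of steps are arbitrary placeholders, and I assume exactly one of the two daughters is subject to mutation per division:

```python
import random
from collections import Counter

random.seed(0)
mu, beta = 0.1, 0.02   # placeholder rates with beta < mu
steps = 15             # population size is 2**steps afterwards

# Each list entry is one cell's mutation count; start from a
# single unmutated cell.
cells = [0]
for _ in range(steps):
    daughters = []
    for c in cells:
        daughters.append(c)                # one daughter is an exact copy
        r = random.random()
        if r < mu:                         # the other may gain a mutation...
            daughters.append(c + 1)
        elif r < mu + beta and c > 0:      # ...or lose one, if it has any
            daughters.append(c - 1)
        else:
            daughters.append(c)
    cells = daughters

counts = Counter(cells)
n = len(cells)
f = {i: counts[i] / n for i in sorted(counts)}
print(f)
```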
I thought we would have $f_i = (1-\mu-\beta)f_i + \mu f_{i-1} + \beta f_{i+1}$: cells with $i-1$ mutations gain one with probability $\mu$, cells with $i+1$ mutations lose one with probability $\beta$, and cells with $i$ mutations neither gain nor lose with probability $1-\mu-\beta$. This seems to be wrong, though, because $f_i = c$ for any constant $c$ satisfies the equation. What is the correct way to solve this problem? Thanks.
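As a sanity check on the update itself, I also tried iterating my candidate equation as a time-step map rather than a fixed-point condition, using $f_0 \leftarrow (1-\mu)f_0 + \beta f_1$ at the boundary (a zero-mutation cell cannot lose one); the rates here are again arbitrary placeholders:

```python
mu, beta = 0.1, 0.02   # placeholder rates with beta < mu
T = 200                # number of time steps
N = T + 2              # enough bins: starting from i = 0, no mass
                       # can reach index T + 1 within T steps

f = [0.0] * N
f[0] = 1.0             # start: every cell has zero mutations
for _ in range(T):
    g = [0.0] * N
    for i in range(N):
        g[i] = (1 - mu - beta) * f[i]
        if i > 0:
            g[i] += mu * f[i - 1]        # gain from the class below
        if i + 1 < N:
            g[i] += beta * f[i + 1]      # loss from the class above
    g[0] += beta * f[0]  # boundary: the blocked "loss" from f[0]
                         # stays at zero mutations
    f = g

mean = sum(i * fi for i, fi in enumerate(f))
print(sum(f), mean)
```

Total mass stays normalized, but the mean keeps growing (roughly like $(\mu-\beta)t$), so the iteration never settles, which seems consistent with my suspicion that the fixed-point equation alone does not pin down the answer.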