Let $h$ be a bounded non-negative function, and let $\mu^N(h)$ be a random quantity with almost sure limit $\mu(h) > 0$. For instance, we could have $\mu^N(h) = N^{-1} \sum_{i = 1}^N h(X^i)$ with $X^{1:N} \overset{i.i.d.}{\sim} \mu$.
Furthermore, assume that the difference $\mu^N(h) - \mu(h)$ is subgaussian, i.e. for some constant $C > 0$ and all $\epsilon > 0$, $$ \mathbb{P}\big( \big| \mu^N(h) - \mu(h) \big| > \epsilon \big) \leq \exp(-CN\epsilon^2) .$$ As a consequence, the $L^2$ norm of the difference goes to zero. I am wondering whether this is also the case for the inverse, i.e. does $ \left\| \frac{1}{\mu^N(h)} - \frac{1}{\mu(h)} \right\|_2$ go to zero as $N \to \infty$?
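(For context, the $L^2$ claim for the difference itself is the standard tail-integration step; this is a sketch assuming the subgaussian bound above holds for every $\epsilon > 0$:
$$ \mathbb{E}\!\left[ \big( \mu^N(h) - \mu(h) \big)^2 \right] = \int_0^\infty \mathbb{P}\!\left( \big| \mu^N(h) - \mu(h) \big| > \sqrt{t} \right) \mathrm{d}t \leq \int_0^\infty e^{-CNt} \, \mathrm{d}t = \frac{1}{CN} \xrightarrow[N \to \infty]{} 0 . )$$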
If $h$ is bounded below away from zero this is true, but I would like to know whether it still holds without this assumption.
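(To spell out what I mean in the lower-bounded case: if $h \geq c > 0$, then $\mu^N(h) \geq c$ almost surely, so
$$ \left| \frac{1}{\mu^N(h)} - \frac{1}{\mu(h)} \right| = \frac{\big| \mu^N(h) - \mu(h) \big|}{\mu^N(h)\,\mu(h)} \leq \frac{\big| \mu^N(h) - \mu(h) \big|}{c\,\mu(h)} , $$
and the $L^2$ norm of the inverse difference is controlled by that of the original difference.)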
I've tried to check the result numerically for the sample mean, and the $L^2$ norm does seem to go to zero as long as the limit $\mu(h)$ is not too small.
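For what it's worth, here is a minimal sketch of the kind of simulation I mean; the choice of $h$, the distribution of the $X^i$, and the number of replications are illustrative assumptions, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: X^i ~ Uniform(0, 1) and h(x) = x, so mu(h) = 1/2.
# h is non-negative and bounded, but not bounded away from zero.
def h(x):
    return x

mu_h = 0.5        # exact limit mu(h)
n_reps = 10_000   # Monte Carlo replications used to estimate the L^2 norm

for N in [10, 100, 1_000, 10_000]:
    X = rng.uniform(0.0, 1.0, size=(n_reps, N))
    mu_N = h(X).mean(axis=1)                        # sample means mu^N(h)
    sq_err = np.mean((1.0 / mu_N - 1.0 / mu_h) ** 2)  # squared L^2 error of the inverse
    print(f"N = {N:6d}   L2 error of 1/mu^N(h): {np.sqrt(sq_err):.4g}")
```

In this toy example the estimated $L^2$ error of the inverse decreases with $N$, which is what leads me to the conjecture above.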