I am a bit new to statistics and have some conceptual questions regarding the calculation of variance.
I want to calculate the variance of a function $y=\frac{\sigma_{X}}{g(\overline{X})}$. As seen in the equation, the numerator of $y$ ($\sigma_{X}$) is the standard deviation of a random variable $X$, and the denominator is a function $g$ of the mean $\overline{X}$ of the same random variable. If $X$ is normally distributed, the uncertainties of the standard deviation and of the mean are well defined. I am thinking of using a first-order Taylor (delta-method) approximation for the variance of $y$:
$$\sigma^2_y \approx \frac{\sigma^2_{X}}{2(N-1)\,g(\overline{X})^2}+\left(\frac{\partial{y}}{\partial{g}}\right)^2\left(\frac{\partial{g}}{\partial{\overline{X}}}\right)^2\frac{\sigma_{X}^2}{N}.$$
However, I am confused about the covariance between the standard deviation $\sigma_{X}$ and the mean $\overline{X}$ of the random variable: does it exist? If so, how can it be calculated?
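For context, here is a minimal simulation sketch of what I mean (my own, with a hypothetical choice $g(m)=m^2$ and made-up parameter values): it estimates the covariance between the sample mean and the sample standard deviation empirically, and compares the empirical variance of $y$ with the first-order approximation above, ignoring any covariance term.

```python
import numpy as np

# Quick Monte Carlo sketch (hypothetical g(m) = m**2, so y = s / m**2).
rng = np.random.default_rng(0)
mu, sigma, N, reps = 5.0, 1.0, 50, 100_000

# Draw many normal samples and compute the sample mean and sample SD of each.
samples = rng.normal(mu, sigma, size=(reps, N))
means = samples.mean(axis=1)
sds = samples.std(axis=1, ddof=1)

# Empirical covariance between the two estimators.
print("Cov(mean, sd):", np.cov(means, sds)[0, 1])

# Compare the empirical variance of y with the first-order (delta-method)
# approximation above, without a covariance term.
y = sds / means**2
g, dg = mu**2, 2 * mu                      # g(mu) and g'(mu)
var_delta = sigma**2 / (2 * (N - 1) * g**2) + (sigma * dg / g**2) ** 2 * sigma**2 / N
print("empirical Var(y):", y.var(ddof=1))
print("delta-method Var(y):", var_delta)
```

In this sketch the empirical covariance comes out very small, but I do not know whether that is exact, specific to the normal case, or just an artifact of the simulation, hence my question.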