
I am a bit new to statistics and have a conceptual question about calculating a variance.

I want to calculate the variance of a function $y=\frac{\sigma_{X}}{g(\overline{X})}$. As seen in the equation, the numerator of $y$ is the standard deviation $\sigma_{X}$ of a random variable $X$, and the denominator is a function $g$ of the mean $\overline{X}$ of the same random variable. If $X$ has a normal distribution, the uncertainties of the standard deviation and the mean are well defined. I am thinking of using a first-order Taylor approximation to calculate the variance of $y$: $\sigma^2_y= \frac{\sigma^2_{X}}{2(N-1)\cdot g(\overline{X})^2}+\left(\frac{\partial{y}}{\partial{g}}\right)^2\left(\frac{\partial{g}}{\partial{\overline{X}}}\right)^2\frac{\sigma_{X}^2}{N}$. However, I am a bit confused about the covariance between the standard deviation $\sigma_{X}$ of the random variable and its mean $\overline{X}$: does it exist? If so, how can it be calculated?
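
For concreteness, here is a minimal sketch (not part of the original post) comparing the first-order (delta-method) variance above to a Monte Carlo estimate. The choice $g(x)=x^2$, the population parameters, and the sample size are all illustrative assumptions, and the covariance term is dropped, anticipating the comments below:

```python
# Minimal sketch: delta-method variance of y = s_X / g(xbar) vs. Monte Carlo.
# The function g, and mu, sigma, N below, are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, N = 5.0, 1.0, 50            # assumed population mean, SD, sample size

def g(x):        # assumed g; keep away from its zeros (see whuber's comment)
    return x**2

def g_prime(x):  # derivative of the assumed g
    return 2 * x

# Delta-method pieces, treating s_X and xbar as uncorrelated (true for Normal data):
# Var(s) ~ sigma^2 / (2(N-1)),  Var(xbar) = sigma^2 / N.
var_s = sigma**2 / (2 * (N - 1))
var_xbar = sigma**2 / N
dy_ds = 1 / g(mu)                           # partial of y w.r.t. the sample SD
dy_dxbar = -sigma * g_prime(mu) / g(mu)**2  # partial of y w.r.t. xbar (chain rule)
var_y_delta = dy_ds**2 * var_s + dy_dxbar**2 * var_xbar

# Monte Carlo: simulate many samples of size N and compute y for each.
samples = rng.normal(mu, sigma, size=(100_000, N))
y = samples.std(axis=1, ddof=1) / g(samples.mean(axis=1))

print(f"delta-method variance: {var_y_delta:.3e}")
print(f"Monte Carlo variance:  {y.var(ddof=1):.3e}")
```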

Ab21
  • You lost me at "the uncertainties ... are well defined," because the SD of a Normal distribution has no uncertainty: it's a number, pure and simple. Do you perhaps mean the *sample standard deviation*? Whether the Taylor expansion can be made to work will depend on the properties of $g.$ In particular, if $g$ is differentiable at at least one point where its value is zero, then $y$ won't even have an expectation and its variance will be infinite: see https://stats.stackexchange.com/questions/299722. – whuber Aug 03 '21 at 17:55
  • The answer to this post: https://stats.stackexchange.com/questions/21875/delta-method-and-correlated-variables may be helpful. For Normally-distributed data, the correlation between the sample mean and sample variance = 0, as it happens, so you have no worries there. – jbowman Aug 03 '21 at 17:57
  • Ok, thanks! Yeah, they are uncorrelated (see the check sketched below). – Ab21 Aug 04 '21 at 14:16
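
As a quick follow-up to the comments, here is a small Monte Carlo check (again an illustrative sketch with assumed sample size and replication count, not from the thread) that the sample mean and sample SD of Normal data are uncorrelated, with a skewed distribution included for contrast:

```python
# Check: for Normal data the sample mean and sample SD are independent (hence
# uncorrelated), so the covariance term in the delta-method formula can be
# dropped. N and reps are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
N, reps = 50, 200_000

normal = rng.normal(loc=5.0, scale=1.0, size=(reps, N))
print("corr(mean, SD), Normal data:     ",
      np.corrcoef(normal.mean(axis=1), normal.std(axis=1, ddof=1))[0, 1])  # ~ 0

# For contrast, a skewed distribution where the two are clearly correlated:
expo = rng.exponential(scale=1.0, size=(reps, N))
print("corr(mean, SD), Exponential data:",
      np.corrcoef(expo.mean(axis=1), expo.std(axis=1, ddof=1))[0, 1])      # > 0
```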

0 Answers