
What are some strategies for estimating the variance of a statistic that is an arbitrary function of multiple random variables? I am familiar with the Delta method, which uses asymptotics to estimate the variance of a statistic when the function takes a single random variable as its input (e.g. $g(X) = \log(X)$). But what are methods for when the statistic involves two or three random variables, such as:

$g(X,Y,Z) = \log(X+Z)-\log(Y+Z)$

Note that in this case the random variables are i.i.d. binomials whose means and variances I have estimated from count data.

agartland
  • I'm not very familiar with these, but just earlier today I was reading a paper by Preacher & Selig (2012) that discusses this (albeit perhaps indirectly), available here: http://www.quantpsy.org/pubs/preacher_selig_2012.pdf They use two simultaneous regression equations to discuss several methods used to estimate the distribution of the product of two coefficients. – Patrick Coulombe Feb 14 '15 at 22:26
  • In your example, $g$ has no variance. – whuber Feb 14 '15 at 22:30
  • If you are okay with not having a closed form solution, simulation should be helpful. See my answer here: http://stats.stackexchange.com/questions/135665/when-to-use-simulations/135666#135666 – TrynnaDoStat Feb 14 '15 at 23:51
  • Thanks for the link about using simulation. The total variance will be some function of $p_X$, $p_Y$ and $p_Z$. Would you simulate using observed values $\hat{p}$? I'd be interested to see how such a simulation could be used to derive a CI on the final statistic. – agartland Feb 17 '15 at 21:52
  • Because both logarithms have positive chances of being applied to arguments that are zero, the modified example has infinite variance and undefined mean. – whuber Feb 17 '15 at 22:03
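The simulation approach suggested in the comments can be sketched with a parametric bootstrap: plug the observed $\hat{p}$ values into the binomial model, draw many replicates of $(X, Y, Z)$, and take the empirical variance and percentile interval of $g$. The sample size `n` and the probabilities below are hypothetical placeholders, not values from the question; and per whuber's caveat, draws where a log argument is zero must be handled explicitly, since the exact variance is infinite (here they are simply dropped, so the results are conditional on non-zero arguments).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs for illustration: the common binomial size n and
# the success probabilities estimated from the count data.
n = 100
p_x, p_y, p_z = 0.3, 0.4, 0.2

B = 100_000  # number of bootstrap replicates
X = rng.binomial(n, p_x, size=B)
Y = rng.binomial(n, p_y, size=B)
Z = rng.binomial(n, p_z, size=B)

# g(X, Y, Z) = log(X + Z) - log(Y + Z)
with np.errstate(divide="ignore"):
    g = np.log(X + Z) - np.log(Y + Z)

# Drop replicates where a log argument was 0 (g = -inf or +inf);
# the estimates below are therefore conditional on non-zero arguments.
g = g[np.isfinite(g)]

var_hat = g.var(ddof=1)                 # simulated variance of g
ci = np.percentile(g, [2.5, 97.5])      # 95% percentile interval for g
```

The same machinery gives a confidence interval directly from the percentiles of the simulated `g`, sidestepping the need for a closed-form variance altogether.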

0 Answers