
I am trying to compute the variance of a squared ratio of two Gaussian random variables (the two variables have different means and standard deviations). I generate the samples by the Monte Carlo method.

I expected better results than with the 1st-order Taylor expansion, but for the moment it gives the same values as the 1st order.

I can't explain this result, so I would be grateful if someone could check my calculation of the variance of this 2nd-order expansion below and tell me whether there is an error.

Let $f(X, Y)=\frac{X^{2}}{Y^{2}}$. Consider a second-order Taylor expansion around $\theta:=\left(\theta_{x}, \theta_{y}\right)$, where $\theta_{x}, \theta_{y}$ are the means of $X$ and $Y$, respectively. We have
$$
\begin{aligned}
\operatorname{Var}(f(X, Y)) \approx \mathbb{E}\Big[\Big(f_{x}(\theta)\left(X-\theta_{x}\right)+f_{y}(\theta)\left(Y-\theta_{y}\right)+\tfrac{1}{2} f_{xx}(\theta)\left(X-\theta_{x}\right)^{2}\\
+\tfrac{1}{2} f_{yy}(\theta)\left(Y-\theta_{y}\right)^{2}+f_{xy}(\theta)\left(X-\theta_{x}\right)\left(Y-\theta_{y}\right)\Big)^{2}\Big]
\end{aligned}
$$
where we have used the approximation
$$
\mathbb{E}(f(X, Y)) \approx f\left(\theta_{x}, \theta_{y}\right)=f(\theta),
$$
which is equivalent to a first-order Taylor expansion.
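
For reference, the partial derivatives that enter this formula are
$$
f_{x}=\frac{2 x}{y^{2}}, \quad f_{y}=-\frac{2 x^{2}}{y^{3}}, \quad f_{xx}=\frac{2}{y^{2}}, \quad f_{yy}=\frac{6 x^{2}}{y^{4}}, \quad f_{xy}=-\frac{4 x}{y^{3}},
$$
each evaluated at $\theta=\left(\theta_{x}, \theta_{y}\right)$ (assuming $\theta_{y} \neq 0$).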

To generate the sample, I need to know the Jacobian linking the PDF of the two correlated Gaussian variables $(X, Y)$ to the PDF from which I produce the $X^{2}/Y^{2}$ random variables.

Indeed, naively, I could do:

    # numpy is imported as np at module level
    mean = np.array([self._mu1, self._mu2])
    cov11 = self._sig1**2
    cov22 = self._sig2**2
    cov12 = self._rho * self._sig1 * self._sig2
    cov = np.array([[cov11, cov12], [cov12, cov22]])
    sample = np.random.multivariate_normal(mean=mean, cov=cov, size=nsample)
    x_vec = sample[:, 0]
    y_vec = sample[:, 1]

    # Take the square of x_vec and y_vec
    x_vec = np.power(x_vec, 2)
    y_vec = np.power(y_vec, 2)
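
For comparison, the same samples also give a direct Monte Carlo estimate of the variance of $X^{2}/Y^{2}$; a minimal sketch (at this point x_vec and y_vec already hold the squared values):

    # Direct empirical estimate of Var(X^2 / Y^2) from the samples above
    # (x_vec and y_vec are already squared, so their ratio is X^2 / Y^2)
    ratio_sq = x_vec / y_vec
    mc_variance = np.var(ratio_sq)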

But I think this approach is not right, since I take the square only after having drawn the $(X, Y)$ sample.

Rather, I need to know the Jacobian that relates the PDF ~ Normal(X, Y, rho_correlation) to the PDF of (X^2, Y^2) with some correlation rho_correlation'.
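
My tentative guess (an assumption on my part, to be confirmed) is that this is just the usual change-of-variables relation for $(U, V)=\left(X^{2}, Y^{2}\right)$: for $u, v>0$,
$$
p_{U, V}(u, v)=\frac{1}{4 \sqrt{u v}} \sum_{s_{1}, s_{2} \in\{-1,+1\}} p_{X, Y}\left(s_{1} \sqrt{u}, s_{2} \sqrt{v}\right),
$$
where $\frac{1}{4 \sqrt{u v}}$ is the inverse of the Jacobian determinant $\left|\partial(u, v) / \partial(x, y)\right|=4|x y|$.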

After this, I could compute the variance directly by doing:

    # Compute directly the mean of the squared expression
    return np.mean((self._fx * (x_vec - self._mu1)
                    + self._fy * (y_vec - self._mu2)
                    + 0.5 * self._fxx * (x_vec - self._mu1)**2
                    + 0.5 * self._fyy * (y_vec - self._mu2)**2
                    + self._fxy * (x_vec - self._mu1) * (y_vec - self._mu2))**2)

But surely, I have to take the squares of mu1, mu2, sig1 and sig2.
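
For concreteness, here is a minimal sketch of one possible way to set the coefficients self._fx, ..., self._fxy, assuming they are simply the partial derivatives listed above evaluated at the (unsquared) means; whether that is the correct evaluation point is part of what I am unsure about:

    # Sketch (assumption): coefficients taken as the partials of f(x, y) = x**2 / y**2
    # evaluated at the means (requires self._mu2 != 0)
    self._fx = 2.0 * self._mu1 / self._mu2**2
    self._fy = -2.0 * self._mu1**2 / self._mu2**3
    self._fxx = 2.0 / self._mu2**2
    self._fyy = 6.0 * self._mu1**2 / self._mu2**4
    self._fxy = -4.0 * self._mu1 / self._mu2**3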

If someone could help me express this Jacobian correctly, that would be great.

  • You are trying to approximate infinity with your series: see https://stats.stackexchange.com/questions/299722. – whuber May 24 '21 at 21:23
  • @whuber I don't see why I am approximating infinity: my current code doesn't produce divergent or infinite values. I have already implemented the 1st-order version and it produces interesting results, but I expect even better results with the 2nd order. – youpilat13 May 24 '21 at 21:39
  • The link I provided explained why the variance must be infinite. Thus, if your approximations are any good, they should be diverging as you attempt to improve them. – whuber May 25 '21 at 11:12

0 Answers