
I am trying to find the expected value of

$\displaystyle E\Bigg[\frac{X}{X+Y}\Bigg]$.

I started with writing

$\displaystyle E\Bigg[\frac{X}{X+Y}\Bigg] = E\Bigg[X\cdot\frac{1}{X+Y}\Bigg] $.

I then noticed that

$E[XY] = \text{cov}(X,Y) + E[X]E[Y]$

which follows from the definition of covariance. So, I have

$\displaystyle E\Bigg[X\cdot\frac{1}{X+Y}\Bigg] = \text{cov}\Big(X, \frac{1}{X+Y}\Big)+E[X]E\Bigg[\frac{1}{X+Y}\Bigg]$

but I don't know how to proceed from here.

The variables $X$ and $Y$ are both normally distributed and positively correlated.
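As a sanity check on the decomposition above, the identity $E[XY] = \text{cov}(X,Y) + E[X]E[Y]$ can be verified numerically (a sketch in Python; the means and covariance matrix are illustrative choices, not from the question):

```python
import numpy as np

# Hypothetical positively correlated bivariate normal (illustrative parameters)
rng = np.random.default_rng(0)
mean = [1.0, 2.0]
cov = [[1.0, 0.5],
       [0.5, 1.0]]
x, y = rng.multivariate_normal(mean, cov, size=100_000).T

# Sample analogue of E[XY] = cov(X, Y) + E[X] E[Y]
lhs = np.mean(x * y)
rhs = np.mean((x - x.mean()) * (y - y.mean())) + x.mean() * y.mean()
print(lhs, rhs)  # the two agree up to floating-point error
```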

exp
  • Finding the expectation requires knowing the bivariate distribution of $(X,Y)$. What are you assuming it is? – whuber Jul 14 '15 at 18:49
  • Bivariate normal. – exp Jul 14 '15 at 19:09
  • Assuming you mean $(X,Y)$ has a binormal distribution, you could equally well ask for $E[X/Y]$ because $(X,X+Y)$ will be binormal, too. http://stats.stackexchange.com/questions/157557 is essentially the same question. (IMHO the comments there are more useful than the answers, because the question was unfortunately posed in a way that invites invalid mathematical manipulation.) http://stats.stackexchange.com/questions/36027 is also relevant. – whuber Jul 14 '15 at 19:14
  • To add to whuber's comment, suppose that jointly normal random variables $X$ and $Y$ have identical marginal distributions (the correlation is not relevant). Then the following _invalid_ mathematical manipulation gives us that $$E\left[\frac{X}{X+Y}\right] + E\left[\frac{Y}{X+Y}\right] = E\left[\frac{X+Y}{X+Y}\right]=E[1] = 1$$ and so since _symmetry_ allows us to infer that $E\left[\frac{X}{X+Y}\right] = E\left[\frac{Y}{X+Y}\right]$, we easily determine that $\displaystyle E\left[\frac{X}{X+Y}\right] = \frac{1}{2}$. What could be easier? So now figure out where the calculations went astray... – Dilip Sarwate Jul 14 '15 at 20:06

2 Answers


If $(X,Y)$ is binormal, then so is $(X,Z) = (X,X+Y)$. The ratio $X/Z$ is the slope of the line through the origin and the point $(Z,X)$. When $X$ and $Z$ are uncorrelated with zero means, it is well known (and easy to compute) that $X/Z$ has a Cauchy distribution. Cauchy distributions have no expectations. This should lead us to suspect $X/Z$ might not have a mean, either. Let's see whether it does or not.
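The Cauchy claim is easy to check by simulation (a sketch, assuming $X$ and $Z$ are independent standard normals):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_normal(n)  # X: zero-mean normal
z = rng.standard_normal(n)  # Z: zero-mean normal, uncorrelated with X
r = x / z                   # ratio of uncorrelated zero-mean normals

# A standard Cauchy has median 0 and quartiles at -1 and +1;
# the sample quantiles agree closely.
q25, q50, q75 = np.quantile(r, [0.25, 0.5, 0.75])
print(q25, q50, q75)

# Heavy tails: a few astronomically large values dominate any sample mean.
print(np.max(np.abs(r)))
```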

For any angle $0 \lt \theta \lt \pi/2$, consider the event

$$E_\theta = \{(Z,X)\,|\, X \ge Z\cot(\theta)\}.$$

This is of interest because its probability is the chance that $X/Z$ exceeds $\cot(\theta)$: the survival function of $X/Z$. It carries all the information of the distribution function of $X/Z$.

$E_\theta$ is a (closed) cone in the plane consisting of all points on all lines making an angle of $\theta$ or less to the right of the vertical ($X$) axis. Let's underestimate the probability of $E_\theta$. To do so, we will work in polar coordinates. Consider any possible radius $\rho$. Among all points of this radius within the set $E_\theta$, the density $f$ of $(Z,X)$ will achieve a minimum value $f_\theta(\rho)$. This minimum must be nonzero provided the density does not degenerate. (More about this possibility later.) Use this to bound the probability

$$\eqalign{ \Pr(E_\theta) &= \int_{\pi/2-\theta}^{\pi/2}\int_0^\infty f(\phi,\rho) \rho d\rho d\phi \\ &\ge \int_{\pi/2-\theta}^{\pi/2}\int_0^\infty \rho f_\theta(\rho) d\rho d\phi \\ &=\theta \int_0^\infty \rho f_\theta(\rho) d\rho \\ &= C(\theta) \theta }$$

where I have written $C(\theta)$ for the integral, which is some positive number depending on $\theta$. Moreover, for $0\lt\theta\lt\pi/2$, $C(\theta)$ has a nonzero lower bound $C \gt 0$.

By definition, the expectation of $X/Z$ is the sum of two parts: one integral for the positive part when $X/Z \ge 0$ and another for the negative part when $X/Z \lt 0$. Let's tackle the positive part. For any positive random variable $W$ with distribution function $F$, integration by parts shows its expectation equals the integral of its survival function $1-F$, since

$$\mathbb{E}(W) = \int_0^\infty w dF(w) = (w(1-F(w))|_0^\infty + \int_0^\infty (1-F(w)) dw = \int_0^\infty (1-F(w)) dw.$$
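This identity is easy to verify numerically for a concrete positive variable, say a unit exponential, where $E[W]=1$ and $1-F(w)=e^{-w}$ (the exponential is just an illustrative choice):

```python
import numpy as np

# For W ~ Exponential(1): E[W] = 1 and the survival function is exp(-w).
w = np.linspace(0.0, 50.0, 200_001)
survival = np.exp(-w)

# Trapezoidal integral of the survival function over [0, 50]
dw = w[1] - w[0]
integral = (survival[:-1] + survival[1:]).sum() * dw / 2

print(integral)  # close to E[W] = 1
```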

Applying this to $W = X/Z$ and substituting $w=\cot(\phi)$ gives for the positive part of the integral

$$\eqalign{ \int_0^\infty (1 - F(w)) dw &= \int_0^{\pi/2} (1 - F(\cot(\phi))) \csc^2(\phi) d\phi \\ &= \int_0^{\pi/2} \Pr(E_\phi) \csc^2(\phi) d\phi \\ &\ge C \int_0^\theta \phi \csc^2(\phi) d\phi \\ &\gt C \int_0^\theta \frac{d\phi}{\phi}. }$$

(The final inequality is a simple consequence of the well-known inequalities $0 \lt \sin(\phi) \lt \phi$ for $0 \lt \phi \lt \pi$, which upon taking the $-2$ power gives $\csc^2(\phi) \gt 1/\phi^2$.)

For any $\theta \gt 0$, the last term is a divergent integral, because for $0\lt \epsilon \lt \theta$,

$$\int_0^\theta \frac{d\phi}{\phi} \gt \int_\epsilon^\theta \frac{d\phi}{\phi} = \log(\theta) - \log(\epsilon) \to \infty$$

as $\epsilon \to 0^{+}$.

Consequently, the positive part of the expectation does not exist. It is immediate that the expectation of $X/Z$ does not exist, either.

We left behind one exception to consider: when the distribution of $(Z,X)$ is concentrated on a line through the origin, this argument breaks down (because then the density can equal zero--and in fact is zero for almost all $\theta$). In this degenerate case, $X/Z$ reduces to a constant--equal to the slope of that line--and obviously that constant is its expectation. This is the only situation in which $X/Z$ has an expectation.
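The non-existence shows up vividly in simulation (a sketch with hypothetical parameters: a positively correlated bivariate normal, as in the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
mean = [1.0, 1.0]
cov = [[1.0, 0.5],
       [0.5, 1.0]]  # positively correlated, as in the question
x, y = rng.multivariate_normal(mean, cov, size=n).T
r = x / (x + y)

# Since X + Y has positive density at 0, the ratio has Cauchy-like tails:
# occasional enormous values keep the running mean from ever settling down.
running_mean = np.cumsum(r) / np.arange(1, n + 1)
print(np.max(np.abs(r)))  # a few astronomically large samples
print(running_mean[n // 2], running_mean[-1])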

whuber

This is a follow-up to whuber's answer, and posted as a separate answer because it is too long for a comment.

Lest people think that it is the bivariate normality of $X$ and $Y$ that is causing the problem, it is worth emphasizing that if $W$ is a continuous random variable whose density is nonzero on an open interval containing the origin, then $E\left[\frac 1W\right]$ does not exist. Since $\frac 1w$ diverges to $\pm\infty$ as $w$ approaches $0$, the integral for $E\left[\frac 1W\right]$, which is of the form $$E\left[\frac 1W\right]=\int_{-\infty}^0 \frac 1w f_W(w)\,\mathrm dw + \int_0^{\infty} \frac 1w f_W(w)\,\mathrm dw\tag{1}$$ is undefined because both integrals on the right side of $(1)$ diverge and the right side of $(1)$ is of the form $\infty-\infty$ (which is undefined).
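A quick simulation illustrates the point (a sketch; $N(0.5, 1)$ is just an illustrative choice of a density that is nonzero near the origin):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
w = rng.normal(loc=0.5, scale=1.0, size=n)  # density nonzero near w = 0

inv = 1.0 / w
# Samples landing near w = 0 blow up, so the sample mean of 1/W is
# dominated by a handful of huge terms and never stabilizes.
print(np.max(np.abs(inv)))
print(np.mean(inv[: n // 2]), np.mean(inv))
```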

Dilip Sarwate