
With $X_1$, $X_2$ and $X_3$ being independent random variables, how can I compute $\mathbb{E}\left[ \frac{X_1}{X_1+X_2+X_3}\right]$?

Is $\mathbb{E}\left[ \frac{X_1}{X_1+X_2+X_3}\right] = \frac{\mathbb{E}\left[X_1\right]}{\mathbb{E}\left[X_1\right]+\mathbb{E}\left[X_2\right]+\mathbb{E}\left[X_3\right]}$? If not, how is it calculated?

Thank you in advance for any clarification.

Xander

1 Answer


There is a famous "folk theorem", a leftover from my undergrad classes, namely that $$\mathbb E[X/Y] = \mathbb E[X] \big/ \mathbb E[Y]$$ and it often surfaces during exams, as it makes computations much easier. Sadly, the equality does not hold in general, even when $X$ and $Y$ are independent: in that case $\mathbb E[X/Y]=\mathbb E[X]\,\mathbb E[1/Y]$, and by Jensen's inequality $\mathbb E[1/Y]\ge 1/\mathbb E[Y]$ for positive $Y$, with equality only in degenerate cases.
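A quick Monte Carlo check makes the gap visible (a sketch using NumPy; the choices $X\sim\text{Exp}(1)$ and $Y\sim\mathcal U(1,2)$ are purely illustrative, picked so that $\mathbb E[X/Y]=\mathbb E[X]\,\mathbb E[1/Y]=\ln 2$ exists and is finite):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X and Y independent; Y ~ Uniform(1, 2) keeps 1/Y bounded so E[X/Y] exists
x = rng.exponential(scale=1.0, size=n)   # E[X] = 1
y = rng.uniform(1.0, 2.0, size=n)        # E[Y] = 3/2

lhs = np.mean(x / y)           # estimates E[X/Y] = E[X] E[1/Y] = ln 2 ≈ 0.693
rhs = np.mean(x) / np.mean(y)  # estimates E[X] / E[Y] = 2/3 ≈ 0.667

print(lhs, rhs)  # lhs > rhs, as Jensen's inequality predicts
```
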

In the case of $\mathbb E[X_1/(X_1+X_2+X_3)]$, numerator and denominator are dependent, which usually makes the computation more difficult. However, in the very special case when the three $X_i$'s are iid, $X_i/(X_1+X_2+X_3)$ has the same distribution for all three $i$'s; since the three ratios sum to one, each has expectation $1/3$. Assuming this expectation exists, of course. It need not: a counterexample is provided by a triplet of Normal variables (see Marsaglia's paper on ratios of Normal variables in connection).
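The symmetry argument is easy to check by simulation (a sketch; the exponential distribution here is just one illustrative choice of iid non-negative variables, for which the expectation clearly exists):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Three iid non-negative variables: by symmetry each ratio
# X_i / (X_1 + X_2 + X_3) has the same distribution, and the
# three ratios sum to 1, so each has expectation 1/3
x = rng.exponential(size=(n, 3))
ratios = x / x.sum(axis=1, keepdims=True)

print(ratios.mean(axis=0))  # each entry ≈ 1/3
```
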

As a special case where the identity works, take the Dirichlet $\mathcal D(\alpha_1,\ldots,\alpha_d)$ distribution, whose expectation is $$\mathbb E[Y_i]=\alpha_i\Big/\sum_{j=1}^d \alpha_j$$ One representation of a Dirichlet random vector $(Y_1,\ldots,Y_d)$ is $$Y_i=\frac{X_i}{X_1+\cdots+X_d}\qquad X_i\sim\mathcal G(\alpha_i,1)$$ where the $X_i$'s are independent. In that case, $$\mathbb E[Y_i]=\mathbb E[X_i/(X_1+\cdots+X_d)]= \mathbb E[X_i]\big/\mathbb E[X_1+\cdots+X_d]$$
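This Gamma representation can be verified numerically as well (a sketch; the parameter vector $(1,2,3)$ is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
alpha = np.array([1.0, 2.0, 3.0])  # illustrative Dirichlet parameters

# Gamma(alpha_i, 1) representation of a Dirichlet vector
x = rng.gamma(shape=alpha, size=(n, 3))
y = x / x.sum(axis=1, keepdims=True)

print(y.mean(axis=0))                          # ≈ alpha / alpha.sum() = [1/6, 1/3, 1/2]
print(x.mean(axis=0) / x.sum(axis=1).mean())   # E[X_i] / E[X_1 + ... + X_d]: same values
```

Here the ratio of expectations and the expectation of the ratio agree, as the answer claims for this special case.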

Xi'an
  • The identity will work whenever the $X_i$ are *iid* with nonzero expectation, as you basically pointed out in an earlier version of this answer. – whuber Mar 24 '21 at 18:29
  • @whuber: Is this enough to ensure that $X_1/(X_1+X_2)$ has a well-defined expectation? – Xi'an Mar 24 '21 at 19:51
  • Good question: I don't think so. For instance, let the distribution be uniform on the values $\{-1,1,3\}$: $|X_1/(X_1+X_2)|$ equals one divided by zero with a chance of $2/9$. Continuous approximations to this will have comparable problems. Notice that these random variables are (a) bounded and (b) have zero probability of being zero. – whuber Mar 24 '21 at 20:55
  • The expression $X_1/(X_1 + X_2)$ also occurred in this question https://stats.stackexchange.com/a/399952/ . There is an expression from Hinkley for the case where the $X_i$ are Gaussian (and the expectation will be infinite). In this question https://stats.stackexchange.com/a/438402 an intuitive view of the ratio distribution is given (and you could do the same for the correlated case). You could also express the distribution of the *angle*: the expectation of the ratio is the expectation of the tangent of the angle, which becomes infinite when 90 degrees has non-zero density. – Sextus Empiricus Mar 25 '21 at 08:01
  • When the $X_i$ are continuous and non-negative, division by 0 occurs only at the point $(X_1,X_2) = (0,0)$ (which has zero probability), and for all other values of $(X_1,X_2)$ the ratio $X_1/(X_1+X_2)$ lies between 0 and 1, so the ratio cannot have an infinite or undefined expectation. – Sextus Empiricus Mar 25 '21 at 08:18
  • Correct! The expectation is defined since $0\le X_1/(X_1+\cdots)\le 1$ with probability one. – Xi'an Mar 25 '21 at 09:28