
I've got the following random variable for which I must find the expected value and variance:
$X_h =\sum_{i=1}^{15} X_i$
where $X_i$ is a random variable taking values in the set $s = \{0, 1, 3\}$, with probability mass function $f(x) = \left\{ \begin{array}{ll} 0.6 & \mbox{if } x = 3 \\ 0.2 & \mbox{otherwise} \end{array} \right.$.

I got the expected value by simply stating that:

$E(X_h) = E(\sum_{i=1}^{15} X_i) = \sum_{i=1}^{15} E(X_i) = \sum_{i=1}^{15} (\sum_s x\cdot f_x(x)) =\sum_{i=1}^{15} (0\cdot 0.2 + 1 \cdot 0.2 + 3 \cdot 0.6) = 15 \cdot 2 = 30$

I did the following for the variance (which appears to be incorrect):
$Var(X_h) = E(X_h^2)-E(X_h)^2 = E((\sum_{i=1}^{15} X_i)\cdot(\sum_{i=1}^{15} X_i)) - E(X_h)^2$

$=E(\sum_{i=1}^{15}\sum_{j=1}^{15} X_i\cdot X_j) - E(X_h)^2$

$=\sum_{i=1}^{15}\sum_{j=1}^{15} E(X_i\cdot X_j) - E(X_h)^2$

$= \sum_{i=1}^{15}\sum_{j=1}^{15} (\sum_s X_i\cdot X_j \cdot f_x(X_i)) - E(X_h)^2$

$= \sum_{i=1}^{15}\sum_{j=1}^{15} (\sum_s X_i^2 \cdot f_x(X_i)) - E(X_h)^2$

$= \sum_{i=1}^{15}\sum_{j=1}^{15} (3^2 \cdot 0.6+1^2\cdot 0.2 + 0^2\cdot 0.2) - E(X_h)^2$

$= 15^2(3^2 \cdot 0.6+1^2\cdot 0.2 + 0^2\cdot 0.2) - 30^2$

$= 360$

Now my solution paper claims the following:
$Var(X_h) = \sum_{i=1}^{15} Var(X_i) = 15 \cdot 1.6 = 24$
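(The $1.6$ here is presumably the per-term variance: $Var(X_i) = E(X_i^2) - E(X_i)^2 = (3^2 \cdot 0.6 + 1^2 \cdot 0.2 + 0^2 \cdot 0.2) - 2^2 = 5.6 - 4 = 1.6$.)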

I've been using this answer to compute the variance, but I think there may be a flaw somewhere. Is it correct that, even if the variables are independent, computing the variance formally should yield the same result? What did I do wrong?

  • I don't understand your definition of distribution of the $X_i$. – Glen_b Jun 03 '15 at 09:14
  • @Glen_b it's a random variable of the set s with a pdf given above. That's all. The index doesn't really mean anything. – Ultimate Hawk Jun 03 '15 at 09:16
  • The term "Bernoulli" normally refers to a distribution over the values {0,1}, while your definition appears to be a discrete distribution over 3 values, but it's not clearly enough described to be certain (since there's a slight ambiguity in your description of the probability function). If you remove the term "Bernoulli" (or clarify enough to justify it) it may be clear enough. – Glen_b Jun 03 '15 at 09:39

1 Answer


The problem is that you should not write:

$$E(X_i\cdot X_j) = \sum_s X_i\cdot X_j\cdot f_x(X_i)$$

because the random variable $X_i \cdot X_j$ follows the joint distribution of $(X_i, X_j)$, not the marginal distribution of $X_i$.
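To make this concrete (a worked check, assuming the $X_i$ are independent, as established in the comments below): for $i \neq j$ the joint pmf factors, so

$$E(X_i\cdot X_j) = \sum_{x_i}\sum_{x_j} x_i\cdot x_j\cdot f_{X_i}(x_i)\cdot f_{X_j}(x_j) = E(X_i)\cdot E(X_j) = 2\cdot 2 = 4.$$

Only the 15 diagonal terms ($i = j$) equal $E(X_i^2) = 5.6$. The double sum then gives $E(X_h^2) = 15\cdot 5.6 + (15^2 - 15)\cdot 4 = 924$, so $Var(X_h) = 924 - 900 = 24$, matching the solution paper.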

Metariat
  • Then what should be written instead? $\sum_s X_i \cdot X_j f_x(X_i, X_j)$? Also, how do I evaluate this sum? Could you provide an example? – Ultimate Hawk Jun 03 '15 at 09:25
  • $\sum_s x_i\cdot x_j\cdot f_{X_i,X_j}(x_i,x_j)$, and you need to calculate the joint distribution. – Metariat Jun 03 '15 at 09:28
  • @Xuan_Quang_DO: since they're independent, the joint probability distribution is $f_{X_i}(x_i)\cdot f_{X_j}(x_j)$. Now over which variables does the sum run? Does it become a double sum, like $\sum_{x_i\in\{0,1,3\}} \sum_{x_j\in\{0,1,3\}} x_i x_j f_{X_i}(x_i) f_{X_j}(x_j)$? – Ultimate Hawk Jun 03 '15 at 09:31
  • Yes, but I didn't see the independence assumption in your question. And in the independent case, the variance of the sum is simply the sum of the variances! – Metariat Jun 03 '15 at 09:36
  • @Xuan_Quang_DO: Yes, that is what the original answer states as well. However, when I sum according to the above and subtract the squared expected value ($E(X_h^2)-E(X_h)^2$) I get 0. Why is that? – Ultimate Hawk Jun 03 '15 at 09:40
  • Maybe you are calculating the double sum $\sum_{x_i} \sum_{x_j}$ incorrectly. There are 9 components: (0,0), (0,1), (0,3), (1,0), (1,1), and so on, not just the 3 components (0,0), (1,1), and (3,3). – Metariat Jun 03 '15 at 09:47
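
For reference, the nine-component double sum from the last comment can be verified numerically; below is a minimal sketch in Python (the pmf and the independence assumption are taken from the question):

    from itertools import product

    # pmf from the question: P(X=0) = 0.2, P(X=1) = 0.2, P(X=3) = 0.6
    pmf = {0: 0.2, 1: 0.2, 3: 0.6}
    n = 15

    # per-term moments
    EX = sum(x * p for x, p in pmf.items())        # E(X_i)   = 2.0
    EX2 = sum(x**2 * p for x, p in pmf.items())    # E(X_i^2) = 5.6

    # E(X_i X_j) for i != j: the 9-component double sum (independence assumed)
    EXiXj = sum(xi * xj * pmf[xi] * pmf[xj]
                for xi, xj in product(pmf, repeat=2))  # = E(X_i)^2 = 4.0

    # assemble E(X_h^2): 15 diagonal terms plus 15 * 14 off-diagonal terms
    EXh2 = n * EX2 + n * (n - 1) * EXiXj           # = 924.0
    print(EXh2 - (n * EX) ** 2)                    # Var(X_h) = 24.0 (up to float rounding)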