5

I am new to statistics and I happened to come across this property of MGFs:

Let $X$ and $Y$ be independent random variables. Let $Z$ be equal to $X$, with probability $p$, and equal to $Y$, with probability $1 − p$. Then, $$M_Z(s)= p M_X(s) + (1 − p) M_Y(s).$$

The proof given is

$$M_Z(s)= E[e^{s Z}]= p E[e^{s X}] + (1 − p)E[e^{s Y}]= p M_X (s) + (1 − p)M_Y (s)$$

But I do not understand this step. Can someone show me a full proof that makes the conditioning on the random choice between $X$ and $Y$ explicit, i.e., why the following holds:

$$M_Z(s)= E[e^{s Z}]= p E[e^{s X}] + (1 − p)E[e^{s Y}]$$

Thanks very much.

user23672
  • 51
  • 1
  • 3
  • Apply the law of total probability to get the distribution of $Z$. Then compute the expectation. – Dilip Sarwate Mar 29 '13 at 13:24
  • Is it possible to show some working? I don't see how the law of total probability and expectation work here. Sorry, I am new to stats. – user23672 Mar 29 '13 at 13:40
  • The law of total probability allows you to express the law of $Z$ as a weighted sum (weights $p$ and $(1-p)$) of the laws of $X$ and $Y$. Then, $E[e^{tZ}]$ can be calculated from the law of $Z$. I assume that you know the formula for finding $E[g(Z)]$ using the law of $Z$? The sum (or integral) for $E[e^{tZ}]$ can then be expressed as a weighted sum of $E[e^{tX}]$ and $E[e^{tY}]$. – Dilip Sarwate Mar 29 '13 at 13:54
  • Each of these three lines says the same thing: it is an *axiom* of probability that conditional probabilities multiply in this way, whence so do expectations (and, of course, MGFs, because they are expectations). – whuber Mar 29 '13 at 13:58
  • Dilip, regarding "I assume that you know the formula for finding $E[g(Z)]$ using the law of $Z$": no, I don't. How do you find the distribution of $Z$ to get its expectation? – user23672 Mar 29 '13 at 14:37
  • "the formula for finding $E[g(Z)]$ using the law of $Z$" is called the _[the law of the unconscious statistician](http://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician)_. – Dilip Sarwate Mar 30 '13 at 01:57

2 Answers

3

You may also see it this way: consider another Bernoulli RV $\Theta$, independent of $X$ and $Y$, which is $0$ with probability $p$ and $1$ with probability $1-p$ (so $P(\Theta = 0) = p$, $P(\Theta = 1) = 1-p$). This variable selects either $X$ or $Y$ with the given probabilities $p$ and $1-p$. Then the PMF of $Z$ is $$P(Z = k) = P(Z = k, \Theta = 0) + P(Z = k, \Theta = 1) = P(X = k, \Theta = 0) + P(Y = k, \Theta = 1)$$ $$ = P(X=k)P(\Theta = 0) + P(Y = k)P(\Theta = 1)$$ by the independence of $\Theta$ from $X$ and $Y$, so $$P(Z = k) = p P(X = k) + (1-p) P(Y = k).$$ The MGF is then $$\mathbf{E}[e^{s Z}] = \sum_{k \in \chi} e^{s k} P(Z = k) = \sum_{k \in \chi} \left[ p\, e^{s k} P(X = k) + (1-p)\, e^{s k} P(Y = k) \right],$$ where $\chi$ is the set of possible values of $Z$, so $$\mathbf{E}[e^{s Z}] = p\, \mathbf{E}[e^{s X}] + (1-p)\, \mathbf{E}[e^{s Y}].$$
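
As a quick numerical sanity check (not part of the derivation above), here is a small simulation sketch. The specific choices $X\sim\mathcal{N}(0,1)$, $Y\sim\mathrm{Exp}(1)$, $s=0.5$, and $p=0.3$ are purely illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
p, s = 0.3, 0.5

# Illustrative assumptions: X ~ N(0, 1), Y ~ Exp(1)
x = rng.normal(0.0, 1.0, size=n)
y = rng.exponential(1.0, size=n)

# Theta = 0 with probability p selects X; Theta = 1 selects Y
theta = rng.random(n) >= p          # True with probability 1 - p
z = np.where(theta, y, x)

mgf_z_empirical = np.mean(np.exp(s * z))

# Closed forms: M_X(s) = exp(s^2 / 2), M_Y(s) = 1 / (1 - s) for s < 1
mixture = p * np.exp(s**2 / 2) + (1 - p) / (1 - s)

print(mgf_z_empirical, mixture)     # the two numbers should be close
```

The empirical MGF of $Z$ and the weighted combination $p\,M_X(s)+(1-p)\,M_Y(s)$ should agree up to Monte Carlo error.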

V.C.
  • 43
  • 7
2

Similar to the other answer, but using the conditional expectation more explicitly.

Flip a coin $\newcommand{\E}{\mathbb{E}} U\sim\mathrm{Ber}(p)$ independently of $X$ and $Y$, and let $Z=UX+(1-U)Y$. Since $U\in\{0,1\}$, we have $e^{tZ}=U\,e^{tX}+(1-U)\,e^{tY}$ almost surely, so, by the independence of $U$ and $(X,Y)$, $$\E\left[e^{tZ}\mid U\right]=U\,\E\left[e^{tX}\right]+(1-U)\,\E\left[e^{tY}\right] \, ,$$ almost surely. The tower property then gives $\E\left[e^{tZ}\right]=\E\left[\E\left[e^{tZ}\mid U\right]\right]=p\,M_X(t)+(1-p)\,M_Y(t)$. $\quad\square$

Take a look at this question if you have any doubts about $\E[e^{tZ}\mid U]$ being itself a random variable which is a function of $U$.
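
To see the conditioning step numerically, here is a short sketch (again with assumed distributions, say $X\sim\mathcal{N}(0,1)$, $Y\sim\mathrm{Exp}(1)$, and illustrative values $t=0.5$, $p=0.3$): averaging $e^{tZ}$ within each outcome of the coin flip estimates $\E[e^{tZ}\mid U=u]$, and recombining recovers the mixture MGF.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
p, t = 0.3, 0.5

# Illustrative assumptions: X ~ N(0, 1), Y ~ Exp(1), U ~ Ber(p)
x = rng.normal(0.0, 1.0, size=n)
y = rng.exponential(1.0, size=n)
u = rng.random(n) < p                     # U = 1 with probability p
z = np.where(u, x, y)                     # Z = U*X + (1 - U)*Y

e_tz = np.exp(t * z)

# E[e^{tZ} | U = 1] should match M_X(t) = exp(t^2 / 2),
# E[e^{tZ} | U = 0] should match M_Y(t) = 1 / (1 - t) for t < 1
print(e_tz[u].mean(), np.exp(t**2 / 2))
print(e_tz[~u].mean(), 1 / (1 - t))

# Tower property: E[e^{tZ}] = p * M_X(t) + (1 - p) * M_Y(t)
print(e_tz.mean(), p * np.exp(t**2 / 2) + (1 - p) / (1 - t))
```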

Zen
  • 21,786
  • 3
  • 72
  • 114