
If $X$ and $Y$ are independent random variables such that $X+Y$ has the same distribution as $X$, then is it always true that $P(Y=0)=1$?

[This is actually a fact that a researcher used (without proof) while giving a lecture on his new paper that I was attending.]
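As a quick numerical sanity check (a sketch only; the choice of $X \sim N(0,1)$ and the constant shifts below are arbitrary assumptions for illustration), one can compare the distribution of $X+Y$ against a fresh copy of $X$ by simulation:

```python
# Sanity check by simulation: X + Y matches X in distribution only when Y = 0.
# (Illustrative assumptions: X ~ N(0, 1); Y degenerate at c for c in {0, 0.1}.)
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(0.0, 1.0, size=n)
x_fresh = rng.normal(0.0, 1.0, size=n)   # independent copy of X for comparison

for c in (0.0, 0.1):
    y = np.full(n, c)                    # Y degenerate at the constant c
    stat, p = ks_2samp(x + y, x_fresh)   # two-sample Kolmogorov-Smirnov test
    print(f"Y = {c}: KS statistic = {stat:.4f}, p-value = {p:.3g}")
# Only c = 0 leaves X + Y indistinguishable in distribution from X.
```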

Qwerty
  • This reads like a routine textbook exercise. Is it for some class? Consider the variance of the sum. What does it imply about $Y$? – Glen_b Nov 05 '16 at 23:49
  • @Glen_b In this problem, is it right to assume that $E[X]$ and $Var[X]$ are finite? – Qwerty Nov 06 '16 at 05:15
  • This is actually a fact that a researcher used (without proof) while giving a lecture (which I was attending) on his new paper. That's why I had not added the `[self-study]` tag. – Qwerty Nov 06 '16 at 05:20
  • In relation to the question about assuming finite moments (which is a good point) -- what's okay to assume about the problem is up to you -- it was a suggestive hint, not an answer. You say it's a research problem, so it's up to you to say what the conditions were, in the question. However, even if your problem is a more general one, the variance argument may give you a good sense of how to generalize it. [How does the research problem arise/what research area is this? Was it an application where the variables are bounded for example? Many real-world variables are.] – Glen_b Nov 06 '16 at 05:37
  • @Glen_b I remember the researcher saying that it is a generally true fact that `If X and Y are independent random ......` – Qwerty Nov 06 '16 at 05:51
  • In the case of more general variables, one might perhaps move to using characteristic functions. ... in fact it's easy to do so. I have posted an answer. – Glen_b Nov 06 '16 at 05:56
  • Similar: https://stats.stackexchange.com/q/303525/119261 – StubbornAtom Jun 28 '20 at 07:06

2 Answers


The variance argument isn't hard to make more general:

Consider the characteristic function of the sum; by independence, $\phi_{X+Y}(t) = \phi_X(t)\,\phi_Y(t)$.

But since $X$ and $X+Y$ have the same distribution, $\phi_{X+Y}(t) = \phi_X(t)$.

Hence $\phi_Y(t) = \phi_{X+Y}(t)/\phi_X(t) = 1$.

This is the characteristic function of a degenerate distribution with all its mass at $0$.
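As a quick numerical illustration of both ingredients of this argument (a sketch; the distributions below are arbitrary choices for demonstration), one can estimate the characteristic functions empirically:

```python
# Empirical characteristic functions: phi_{X+Y} factorises as phi_X * phi_Y
# under independence, and phi_Y == 1 exactly when Y is degenerate at 0.
# (Illustrative assumptions: X ~ Exp(1), Y ~ N(0, 0.25).)
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

def ecf(sample, t):
    """Empirical characteristic function E[exp(i*t*Z)] at each point of t."""
    return np.exp(1j * np.outer(t, sample)).mean(axis=1)

t = np.linspace(-2.0, 2.0, 9)
x = rng.exponential(1.0, size=n)
y = rng.normal(0.0, 0.5, size=n)

# Independence gives the factorisation, up to Monte Carlo error:
print(np.max(np.abs(ecf(x + y, t) - ecf(x, t) * ecf(y, t))))

# A degenerate Y = 0 has phi_Y(t) = 1 exactly, matching the conclusion:
print(np.max(np.abs(ecf(np.zeros(n), t) - 1.0)))
```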

Glen_b
  • Basically you are cancelling out $\phi_X(t)$ from $\phi_X(t)\cdot \phi_Y(t)=\phi_X(t)$. Right? (This might be a silly question, but I still need to know.) But how can you guarantee that $\phi_X(t)$ will never be zero? – Qwerty Nov 06 '16 at 06:12
  • Yep, good point. I'll have to come back and fix that. I can't fix it right now though. – Glen_b Nov 06 '16 at 06:22
  • Oh, actually, isolated points shouldn't make a difference when we do the integration to take it back, so as long as $\phi_X$ is not zero on an interval/region it shouldn't matter. Anyway, I'll come back to it. – Glen_b Nov 06 '16 at 06:27
  • Can you please give a complete fix of this today itself? I have a continuing lecture (of that researcher) to attend today at $20:00$ hrs. It's now $12:39$ here. – Qwerty Nov 06 '16 at 07:10
  • I am sorry if you felt like I was forcing you or something. I just meant it as a request. – Qwerty Nov 06 '16 at 09:44
  • @Glen_b: a simple argument completing yours is that $\phi_X(0)=\phi_Y(0)=1$ and that both functions are [non-vanishing in a neighbourhood of zero due to the continuity of the characteristic function](https://en.wikipedia.org/wiki/Characteristic_function_(probability_theory)#Properties). – Xi'an Nov 06 '16 at 11:06
  • @Qwerty: I do not understand how missing the complete argument can prevent you from attending the next lecture. Of course, if this was a homework assignment and you had to return it by this deadline, it would explain the urgency... – Xi'an Nov 06 '16 at 11:08
  • @Xi'an A man lectures part of a big proof on one day and promises to complete the other half the next day. If I don't thoroughly understand the first day's proof, how do you expect me to follow today's lecture? – Qwerty Nov 06 '16 at 11:11
  • @Xi'an How does that guarantee that $\phi_X(t)$ will never be zero? (The cancelling out can be done only if $\phi_X(t)$ is nonzero $\forall t\in \Bbb{R}$, right?) – Qwerty Nov 06 '16 at 11:14
  • @Qwerty it doesn't guarantee that $\phi_X(t)$ will never be $0$, but the point Xi'an was alluding to was that it doesn't need to. – Glen_b Nov 06 '16 at 11:17
  • @Glen_b Could you please explain some more? – Qwerty Nov 06 '16 at 11:19
  • If $\phi_Y$ is equal to one in a neighbourhood of $0$, this is sufficient [by a complex analysis argument] to show that $\phi_Y$ is constant everywhere and hence that $Y$ is constant. – Xi'an Nov 06 '16 at 11:26
  • @Qwerty I'll have to try to do it properly later in an edit to the answer, but I can outline the gist of it (you may have seen I hinted at such an argument in my first comment under my answer, before I edited it back out): $\phi_X$ must be non-vanishing in a neighborhood of $0$, so $\phi_Y$ must then be $1$ in that neighborhood of $0$. That would actually be enough to put all the probability for $Y$ at $0$ (e.g. it means that all the moments must be zero). – Glen_b Nov 06 '16 at 11:31
  • Oh, Xi'an has a better argument there. – Glen_b Nov 06 '16 at 11:32
  • @Glen_b Looking forward to your edit.. – Qwerty Nov 06 '16 at 12:06
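To spell out a completion along the lines of these comments (one standard argument, not necessarily the exact one intended above): since $\phi_X$ is continuous with $\phi_X(0)=1$, it is non-vanishing on some interval $(-\delta,\delta)$, so the cancellation is valid there and $\phi_Y(t)=1$ for $|t|<\delta$. Taking real parts,

$$\mathbb{E}\left[1-\cos(tY)\right]=0 \quad\text{for } |t|<\delta,$$

and since $1-\cos(tY)\ge 0$, it follows that $\cos(tY)=1$ almost surely, i.e. $Y\in\tfrac{2\pi}{t}\mathbb{Z}$ a.s. for each fixed $0<t<\delta$. Intersecting over two values of $t$ with irrational ratio leaves only $Y=0$, hence $P(Y=0)=1$.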

My approach. We are essentially comparing $X$ with $X+Y$.

The variance of $X+Y$ is, given independence:

$$\text{Var}(X+Y)=\text{Var}(X)+\text{Var}(Y)$$

Since $X+Y$ has the same distribution as $X$, we have $\text{Var}(X+Y)=\text{Var}(X)$, which implies that $\text{Var}(Y)=0$. This tells us that $Y$ is almost surely a constant.

Now, taking the expectation of $X+Y$:

$$\mathbb{E}[X+Y]=\mathbb{E}[X]+\mathbb{E}[Y]$$

But $X+Y$ has the same expectation as $X$, which implies $\mathbb{E}[Y]=0$.

So, $Y$ is a constant with expectation zero. This implies that $\text{Pr}(Y=0)=1$.
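A simulation sketch of this moment argument (the gamma distribution for $X$ and the candidate $Y$'s below are arbitrary illustrative choices):

```python
# Moment check: X + Y can match both E[X] and Var(X) only if Y = 0 a.s.
# (Illustrative assumptions: X ~ Gamma(2, 1); three candidate Y's.)
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.gamma(2.0, 1.0, size=n)

candidates = {
    "Y = 0 a.s.":     np.zeros(n),
    "Y = 0.5 a.s.":   np.full(n, 0.5),           # E[Y] != 0
    "Y ~ N(0, 0.25)": rng.normal(0.0, 0.5, n),   # Var(Y) > 0
}
for name, y in candidates.items():
    s = x + y
    print(f"{name:15s} E[X+Y]-E[X] = {s.mean() - x.mean():+.4f}, "
          f"Var(X+Y)-Var(X) = {s.var() - x.var():+.4f}")
# Only the degenerate Y = 0 leaves both moments of X unchanged.
```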

epp
  • Nice answer, StatsPlease. But did you see the comments by Glen_b above? The asker was supposed to show their progress first. – Xu Wang Nov 06 '16 at 04:32
  • @XuWang Should I delete my answer? – epp Nov 06 '16 at 04:33
  • I don't know. I am new here too :) – Xu Wang Nov 06 '16 at 04:34
  • Please don't delete your answer! – Qwerty Nov 06 '16 at 05:10
  • And I think one thing you have assumed is wrong: $E[X]+E[Y]=E[X]\implies E[Y]=0$ only if $E[X]$ is finite. But nowhere has it been said so! Take, for example, the Cauchy distribution! – Qwerty Nov 06 '16 at 05:12
  • @StatsPlease You don't have to delete it now you've posted it. But be aware of our policy on routine bookwork questions for future reference. (See the `self-study` [guidelines](http://stats.stackexchange.com/tags/self-study/info) for example). – Glen_b Nov 06 '16 at 05:29
  • @Qwerty Does $X$ having finite expectation (or variance) matter? I don't presume to be an expert. – epp Nov 06 '16 at 05:40
  • @StatsPlease If $X$ does not have finite expectation, how do you say that $E[Y]=0$? – Qwerty Nov 06 '16 at 05:49
  • @Qwerty Perhaps my grasp of mathematics has weakened over the years. If $a+b=a$, doesn't this imply, irrespective of the value of $a$ (because it appears on both sides), that $b=0$? There could be some detail I'm missing. – epp Nov 06 '16 at 05:58
  • Take $a\to\infty$. Whatever value $b$ takes, it will satisfy the equation. – Qwerty Nov 06 '16 at 06:14