
As a newbie in probability, I am currently cleaning up my understanding of the Gaussian distribution.

I know that

  1. If $X$ and $Y$ are jointly Gaussian, then $aX+bY$ ($a$ and $b$ both constants) is also Gaussian (see the sketch after this list).
  2. If $X$ and $Y$ are Gaussian and uncorrelated (hence independent), then $aX+bY$ ($a$ and $b$ both constants) is also Gaussian.
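
For instance, here is a quick simulation sketch of point 1. It is only illustrative: the correlation $0.7$ and the coefficients $a, b$ are arbitrary choices of mine, and NumPy/SciPy are assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000

# Draw (X, Y) jointly Gaussian with unit variances and correlation 0.7.
cov = [[1.0, 0.7],
       [0.7, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# For jointly Gaussian (X, Y), aX + bY is Gaussian with mean 0 and
# variance a^2 + b^2 + 2*a*b*0.7.
a, b = 2.0, -3.0
z = a * x + b * y

print("sample variance:", z.var(), "| theory:", a**2 + b**2 + 2 * a * b * 0.7)
# The p-value should typically be well above 0.05, since z really is Gaussian.
print("normality test p-value:", stats.normaltest(z).pvalue)
```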

My question

What if $X$ and $Y$ are each Gaussian on their own (NOT jointly Gaussian)? Without assuming uncorrelatedness (or independence), can we still say that $aX+bY$ is Gaussian?

If not, does "$X$ and $Y$ are orthogonal" change our conclusion?

Sibbs Gambling
  • Check this answer: http://stats.stackexchange.com/questions/19948/what-is-the-distribution-of-the-sum-of-non-i-i-d-gaussian-variates?rq=1 – Tim Nov 28 '14 at 08:42
  • @Tim That question and its answers are built upon "$X$ and $Y$ are jointly Gaussian", aren't they? – Sibbs Gambling Nov 28 '14 at 08:57
  • What you know ain't so. Being uncorrelated isn't sufficient. For a counterexample, see [here](http://stats.stackexchange.com/questions/125648/transformation-chi-squared-to-normal-distribution/125653#125653). Many other counterexamples can be found here on CV. – Glen_b Nov 28 '14 at 10:58
  • Actually I think you should unfix it, since fixing it invalidates some of the responses you have. It's actually more useful to leave it that way. – Glen_b Nov 28 '14 at 11:08
  • Your 2. is a subset of 1. because if $X$ and $Y$ are marginally Gaussian _and_ independent, then they are also jointly Gaussian. – Dilip Sarwate Nov 28 '14 at 15:31

2 Answers


No, and this is a common fallacy. People tend to forget that the sum of two Gaussian random variables is Gaussian only if $X$ and $Y$ are independent or jointly normal.

Here is a nice explanation.
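
In case the link goes stale, here is a minimal simulation sketch of the classic counterexample $Y = SX$, where $S$ is a fair $\pm 1$ sign independent of $X \sim N(0,1)$ (NumPy and SciPy assumed):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000

x = rng.standard_normal(n)           # X ~ N(0, 1)
s = rng.choice([-1.0, 1.0], size=n)  # independent fair sign S
y = s * x                            # Y = SX is also exactly N(0, 1) by symmetry

# X and Y are uncorrelated -- and orthogonal, since both have mean zero:
# E[XY] = E[S] * E[X^2] = 0.
print("sample corr(X, Y):", np.corrcoef(x, y)[0, 1])  # ~ 0

# But X + Y = (1 + S)X is 0 with probability 1/2 and N(0, 4) otherwise,
# so it is certainly not Gaussian.
z = x + y
print("P(X + Y = 0):", np.mean(z == 0.0))                      # ~ 0.5
print("normality test p-value:", stats.normaltest(z).pvalue)   # ~ 0, rejected
```

Note that this also addresses the orthogonality follow-up: here $E\{XY\} = 0$, so $X$ and $Y$ are orthogonal as well as uncorrelated, and the sum is still not Gaussian.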

Salvador Dali
  • So if $X$ and $Y$ are orthogonal, this doesn't change the conclusion? Can you give an example illustrating this? – Sibbs Gambling Nov 28 '14 at 08:44
  • @FarticlePilter From what I remember, orthogonal simply means uncorrelated (which does not by itself mean independent). I am looking for an illustration for my example. – Salvador Dali Nov 28 '14 at 08:50
  • No, orthogonality is not uncorrelatedness. $X$ and $Y$ are orthogonal if $E\{XY\}=0$. Orthogonality coincides with uncorrelatedness only when $E\{X\}=0$ or $E\{Y\}=0$. – Sibbs Gambling Nov 28 '14 at 08:52
  • Isn't "independent or jointly normal" redundant here? If X and Y are independent and normally distributed then they're jointly normal with a diagonal covariance matrix, right? – Brian Borchers Nov 28 '14 at 19:40
  • @BrianBorchers Yes, it is redundant to consider the case of independent variables separately. See, for example, my comment on the main question. But it is a shibboleth murmured by many users of statistics (and possibly some practitioners of statistics as well) that independent Gaussian random variables must be treated separately from jointly Gaussian random variables. – Dilip Sarwate Nov 29 '14 at 04:26
  • The site is no longer there. Does anyone have a copy of the intended PDF? – adunaic Feb 04 '20 at 16:26
  • @adunaic, and maybe others (even though I'm a little late to this question): the site still exists, just not the pdf, i.e., going to https://planetmath.org/ you can see that they have a github repository where they have a lot of tex files, probably the ones used to generate pdfs like the one that isn't in the site anymore. For instance, this one might be the explanation to this answer: https://github.com/planetmath/62_Statistics/blob/master/62E15-SumsOfNormalRandomVariablesNeedNotBeNormal.tex (I don't know, haven't checked them all yet) – Jose Ramirez Nov 25 '20 at 12:44

Your second point says “uncorrelated (hence independent)”. That is not quite right: two Gaussian random variables can be uncorrelated and yet dependent. Uncorrelatedness implies independence only if they are jointly Gaussian.
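
A standard example (the same construction as in the other answer's sketch): take $X \sim N(0,1)$ and $Y = SX$, where $S$ is a fair $\pm 1$ sign independent of $X$. Both $X$ and $Y$ are $N(0,1)$, and

$$\operatorname{Cov}(X, Y) = E[XY] = E[SX^2] = E[S]\,E[X^2] = 0,$$

so they are uncorrelated, yet $|Y| = |X|$ always, so they are certainly dependent. In particular they are not jointly Gaussian, since joint Gaussianity plus uncorrelatedness would force independence.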

Omar