
That $X-Y$ should be symmetrically distributed for iid $X,Y$ is obvious simply by interchanging the roles of $X$ and $Y$ -- informally we might argue:

Let $Z=X-Y$ have distribution $F$. Which observation we call $X$ and which we call $Y$ is arbitrary; therefore $-Z=Y-X$ must have the same distribution. If $-Z$ and $Z$ have the same distribution then that distribution is symmetric (about $0$).
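
As a quick sanity check (not a proof), here is a small R simulation comparing the empirical distributions of $Z$ and $-Z$ for iid $X,Y$; the exponential distribution and the sample size are arbitrary choices for illustration:

```r
# Sanity check (not a proof): for iid X and Y, the empirical
# distributions of Z = X - Y and -Z should agree up to sampling noise.
set.seed(1)
n <- 1e5
x <- rexp(n)   # any common distribution will do; the exponential is
y <- rexp(n)   # deliberately chosen to be asymmetric
z <- x - y
qqplot(z, -z)  # Q-Q plot of Z against -Z ...
abline(0, 1)   # ... should hug the identity line
```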

However I have a vague recollection of having encountered an odd counterexample at some point (I wonder if it might perhaps have been in the Counterexamples book by Romano and Siegel).

Is there some subtlety in the argument outlined above (that symmetry in the roles of $X$ and $Y$ implies symmetry of the distribution) which goes astray in some edge case, or is the implied more formal version of the argument solid (indicating that I am simply misremembering the existence of an exception)?

I can't see any obvious way to break it, but "I don't see how to do it" doesn't mean much as an argument. I suppose I may have misrecalled some detail; perhaps the exception actually arose because independence was missing from the original formulation (in which case I believe I could find an exception myself). [Edit: Indeed, I have now done so.]

I expect the answer is "no, you're misremembering, obviously it's symmetric" but this nags at me now and then and I worry my notions are flawed in some way.

Glen_b
  • This seems neither subtle nor deep. If you can exchange $X$ and $Y$ and thereby obtain the same joint distribution, then under the exchange $X-Y$ becomes $-(X-Y)$ and it is immediate that $X-Y$ is symmetric about zero. – whuber Oct 15 '17 at 14:35
  • @whuber I agree that the notion that there should be symmetry is not at all deep; indeed you seem to just give the same outline of argument I did near the start of the question. The issue of whether there was some subtlety would underlie only a positive answer to "did I miss something there?" in my argument (though the answer is, as you say, "no") -- for that argument to have been wrong in some way would seem to require something subtle. – Glen_b Oct 15 '17 at 20:09
  • My point is that the result follows directly from the [definition](https://stats.stackexchange.com/a/29010/919): it's a straightforward application. There's no need to do any calculation, invoke cfs or cgfs, etc. – whuber Oct 15 '17 at 21:22
  • I don't disagree with any of that; I am clear that the doubts I held in that argument were unfounded, but I can't pretend I didn't harbour them anyway. – Glen_b Oct 16 '17 at 01:34
  • 1
    An interesting example for a somehat different question to the one here (omitting "independent" and dealing with the sum) is discussed in Chen and Shepp 1983, "On the sum of symmetric random variables", *Am. Stat.* Vol. 37, No 3 .p237 - with an example of two identically distributed Cauchy r.v.s whose sum is *also* symmetric -- but not about 0. – Glen_b Feb 25 '19 at 07:14

2 Answers


Corrected after @Glen_b pointed out a glaring error. Sloppy proof, but should work.

I think we can prove this using characteristic functions.

Let $X, Y$ be iid and let $Z = Y - X$. Then

$\phi_{X-Y}(t) = E[e^{it(X-Y)}] = \phi_X(t)\phi_{-Y}(t)$.

Like the CDF, the characteristic function of a random variable uniquely characterizes its distribution, and it exists for any real-valued random variable. Since $X$ and $Y$ are identically distributed, we have $\phi_X(t) \equiv \phi_Y(t)$.

This, together with the behaviour of the characteristic function under linear transformations, implies that

$\phi_{-X}(t) =\phi_{X}(-t) = \phi_{Y}(-t)=\phi_{-Y}(t) $.

In turn, this implies that

$\phi_X(t)\phi_{-Y}(t) = \phi_X(t)\phi_{-X}(t) = \phi_Y(t)\phi_{-X}(t) =\phi_{Y-X}(t) $,

so that $\phi_{X-Y}(t)=\phi_{Y-X}(t)$, and hence, by the uniqueness property above, $Z$ and $-Z$ have the same distribution.
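
A numerical illustration of this argument (not part of the proof): a distribution is symmetric about $0$ exactly when its characteristic function is real-valued, so a Monte Carlo estimate of $\phi_Z(t)$ should have negligible imaginary part. The gamma distribution below is an arbitrary asymmetric choice.

```r
# Monte Carlo estimate of phi_Z(t) = E[exp(itZ)] for Z = Y - X with
# X, Y iid; symmetry about 0 <=> phi_Z is real-valued, so the
# imaginary part should vanish up to sampling error of order n^(-1/2).
set.seed(1)
n <- 1e6
x <- rgamma(n, shape = 2)  # asymmetric common distribution (arbitrary)
y <- rgamma(n, shape = 2)
z <- y - x
tt <- seq(-3, 3, by = 0.5)
phi <- sapply(tt, function(t) mean(exp(1i * t * z)))
max(abs(Im(phi)))  # small: of order n^(-1/2), i.e. ~1e-3 here
```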

David Kozak

Just to clear up the source of my own confusion: I managed to coax just enough (about 4 lines!) out of Google Books to resolve the origin of my doubt. It was from Romano and Siegel* and what they actually have there is:

4.34 Identically distributed random variables such that their difference does not have a symmetric distribution

If $X$ and $Y$ are independent and identically distributed, then $X-Y$ has a symmetric distribution about zero. The independence assumption cannot be dropped in general. (However, if $X$ and $Y$ are exchangeable, then $X-Y$ does have a symmetric distribution.)

A simple counterexample I came up with is $X\sim U[0,3)$ and $Y=(X-1) \text{ mod } 3$, so that $Y$ is also uniform on $[0,3)$, while $X-Y$ takes the value $1$ with probability $\frac23$ and $-2$ with probability $\frac13$. This isn't the one I thought of just after I posted the question, but once you have one, further counterexamples are easy to think up, and this one is simpler to explain. (Edit: in the end I managed to see their counterexample for the dependent case; it's fine -- a simple bivariate example on $\{-1,0,1\}^2$ -- but mine's simpler to express so I'll leave it there.)
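
For anyone who wants to verify the claimed point masses by simulation, here is a quick R check (essentially the same construction as the snippet in the comments below):

```r
# Counterexample check: X ~ U[0,3), Y = (X - 1) mod 3.
# For X in [1,3) we get Y = X - 1, so X - Y =  1  (probability 2/3);
# for X in [0,1) we get Y = X + 2, so X - Y = -2  (probability 1/3).
set.seed(1)
x <- runif(1e5, 0, 3)
y <- (x - 1) %% 3
table(round(x - y, 10)) / length(x)  # roughly: -2 -> 0.333, 1 -> 0.667
```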

Note here that $(X,Y)$ doesn't have the same distribution as $(Y,X)$ -- so we don't have the exchangeability that R&S mention, which is why the asymmetry is possible. Note also that the informal argument in my question -- "interchange the roles of $X$ and $Y$" -- quite directly relied on exchangeability, and we can see from that outline why that weaker condition should be sufficient to get that $X-Y$ and $Y-X$ have the same distribution.
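
To illustrate the exchangeable-but-dependent case that R&S mention, here is one simple construction (my own choice for illustration, not theirs): take $X = A + B$ and $Y = A + C$ with $A, B, C$ iid. Swapping $B$ and $C$ swaps $X$ and $Y$, so $(X,Y)$ is exchangeable, and $X - Y = B - C$ is symmetric about $0$ even though $X$ and $Y$ are dependent:

```r
# Exchangeable but dependent pair: X = A + B, Y = A + C with A, B, C iid.
set.seed(1)
n <- 1e5
a <- rexp(n); b <- rexp(n); d <- rexp(n)
x <- a + b
y <- a + d
cor(x, y)                # about 0.5: clearly dependent
qqplot(x - y, -(x - y))  # quantiles of X - Y against those of -(X - Y)
abline(0, 1)             # should hug the identity line
```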

* Romano, J.P. and Siegel, A.F. (1986), *Counterexamples in Probability and Statistics*, Wadsworth & Brooks/Cole Statistics/Probability Series.
(My question was based on a recollection from reading a little of it in late 1986... so it's no surprise I was a little fuzzy on the details.)

Glen_b
  • Out of curiosity, what prompted the recollection? And, technically you also dropped the "identically distributed" assumption in your example, didn't you? I can't think of a counterexample that only violates independence off the top of my head, but believe that there would be. – David Kozak Oct 15 '17 at 00:44
  • 1. What prompted my recollection this time was writing a comment under another answer about symmetry of $X-Y$ for independent identically distributed $X,Y$ (in relation to the signed rank test) -- I was suddenly given pause that maybe I had seen a weird counterexample, a doubt that has arisen before. I figured it was time to just resolve it and while I could just have gone and done it (indeed I arrived at what is essentially your final argument shortly after posting my question), I figured it was worth having a question up anyway. ...... ctd – Glen_b Oct 15 '17 at 00:59
  • ctd ... 2. No, in my example both $X$ and $Y$ are $U[0,3)$. If you have R, try this: `x=runif(10000,0,3);` `y=(x-1) %% 3;` `par(mfrow=c(2,2));` `hist(x,n=30);` `hist(y,n=30);` `hist(x-y,n=30);` `plot(x,y)` ... which [illustration](https://i.stack.imgur.com/Xv5uH.png) I think makes the situation clear enough. – Glen_b Oct 15 '17 at 00:59
  • Aha, clever. Running a bit slow today, apparently. – David Kozak Oct 15 '17 at 01:05