
Let $X$ and $Y$ be two i.i.d. random variables. I am trying to prove that $\mathbb{P}(X<Y)=\mathbb{P}(Y<X)$. The author claims that this is true by symmetry, but I would like to prove it rigorously.

I haven't studied joint distributions yet. Here is my attempt at a proof.

Let $\Omega$ be the sample space of interest. Then we have the following.

$\mathbb{P}(X<Y)=\mathbb{P}\left(\{\omega\in\Omega:X(\omega)<Y(\omega)\}\right)$

$\mathbb{P}(Y<X)=\mathbb{P}\left(\{\omega\in\Omega:Y(\omega)<X(\omega)\}\right)$

One way to show that $\mathbb{P}(X<Y)=\mathbb{P}(Y<X)$ is to show that the following equation holds:

$$\mathbb{P}\left(\{\omega\in\Omega:X(\omega)<Y(\omega)\}\right)=\mathbb{P}\left(\{\omega\in\Omega:Y(\omega)<X(\omega)\}\right)\tag{1}$$

However, I couldn't continue further using the above approach. Instead, I am using the following method, but it works only if $X$ and $Y$ are discrete random variables.

Let $A:=\{x\in\mathbb{R}:\mathbb{P}(X=x)>0\}$. Then $A$ is the support of $X$, and also of $Y$, since $X$ and $Y$ are i.i.d.

Let $B:=\{(x,y)\in A\times A:x<y\}$.

We then have

$\begin{equation*}\begin{split}\mathbb{P}(X<Y)&=\sum_{(x,y)\in B}\mathbb{P}(X=x,Y=y)\\&=\sum_{(x,y)\in B}\mathbb{P}(X=x)\mathbb{P}(Y=y)\quad\quad\quad \left(X,Y \text{ are independent}\right)\\ &=\sum_{(x,y)\in B}\mathbb{P}(Y=x)\mathbb{P}(X=y)\quad\quad\quad \left(X,Y \text{ are identically distributed}\right)\\&=\sum_{(x,y)\in B}\mathbb{P}(Y=x,X=y)\quad\quad\quad \left(\text{independence again}\right)\\&=\mathbb{P}(Y<X)\end{split}\end{equation*}$

The above proof holds when $X$ and $Y$ are discrete. But how do I approach this problem when $X$ and $Y$ are continuous random variables? Also, is it possible to prove the above fact directly by showing that equation $(1)$ holds?
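As a quick sanity check (not a proof), here is a short Monte Carlo simulation; it assumes numpy, and the exponential and fair-die distributions are just illustrative choices:

```python
import numpy as np

# Estimate P(X < Y) and P(Y < X) for i.i.d. X and Y by simulation.
rng = np.random.default_rng(seed=0)
n = 10**6

# Continuous case: P(X = Y) = 0, so both probabilities should be ~1/2.
x, y = rng.exponential(size=n), rng.exponential(size=n)
print(np.mean(x < y), np.mean(y < x))  # both approximately 0.5

# Discrete case (fair six-sided die): P(X = Y) = 1/6, so
# P(X < Y) = P(Y < X) = (1 - 1/6)/2 = 15/36 ≈ 0.4167.
x, y = rng.integers(1, 7, size=n), rng.integers(1, 7, size=n)
print(np.mean(x < y), np.mean(y < x))  # both approximately 0.4167
```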

  • See http://stats.stackexchange.com/questions/256444/why-are-all-the-permutations-of-i-i-d-samples-from-a-continuous-distribution-eq/256580#256580 – whuber Feb 02 '17 at 04:22
  • I'm new to stats, so bear with me. Shouldn't it be the case that your argument with sums goes through with all the sums replaced by integrals for the continuous case? $$P(X < Y) = \int_{\Omega'} P(X=a,Y=b)dP = \int_{\Omega'} P(X=a)P(Y=b)dP=...$$ where $$\Omega'= \{\omega : X(\omega) < Y(\omega)\}$$ – Andrew Maurer Feb 02 '17 at 05:13

1 Answer


It is not necessary to frame things in measure-theoretic terms for this kind of question. Let $X$ and $Y$ be IID random variables with an arbitrary common distribution function $F$. From the IID assumption, we have:

$$\mathbb{P}(X \leqslant x, Y \leqslant y) = F(x) \cdot F(y) \quad \quad \quad \text{for all } x,y \in \mathbb{R}.$$

We will not make any assumption about whether the distribution is discrete or continuous (or a mixture), so we will use Lebesgue-Stieltjes integration to marginalise over the random variables. Using the law of total probability (expressed via Lebesgue-Stieltjes integration), we have:

$$\begin{equation} \begin{aligned} \mathbb{P}(X \leqslant Y) &= \int \limits_\mathbb{R} \mathbb{P}(X \leqslant y|Y=y) \ d \mathbb{P}(Y \leqslant y) \\[6pt] &= \int \limits_\mathbb{R} \mathbb{P}(X \leqslant y) \ d \mathbb{P}(Y \leqslant y) & \text{(by independence)} \\[6pt] &= \int \limits_\mathbb{R} F(y) \ dF(y) & \text{(identically distributed)} \\[6pt] &= \int \limits_\mathbb{R} \mathbb{P}(Y \leqslant y) \ d \mathbb{P}(X \leqslant y) & \text{(identically distributed)} \\[6pt] &= \int \limits_\mathbb{R} \mathbb{P}(Y \leqslant y|X=y) \ d \mathbb{P}(X \leqslant y) & \text{(by independence)} \\[6pt] &= \mathbb{P}(Y \leqslant X). \end{aligned} \end{equation}$$
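As an aside, when $F$ is continuous this also pins down the common value: by the probability integral transform, $F(Y)\sim\mathrm{Uniform}(0,1)$, so

$$\mathbb{P}(X \leqslant Y) = \int \limits_\mathbb{R} F(y) \ dF(y) = \mathbb{E}[F(Y)] = \int_0^1 u \ du = \frac{1}{2},$$

which is consistent with the corollary below, since $\mathbb{P}(X=Y)=0$ in the continuous case.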

This establishes that $\mathbb{P}(X \leqslant Y) = \mathbb{P}(Y \leqslant X)$. As a simple corollary of this fact, we also have:

$$\begin{equation} \begin{aligned} \mathbb{P}(X < Y) = \mathbb{P}(X \leqslant Y) - \mathbb{P}(X = Y) = \mathbb{P}(Y \leqslant X) - \mathbb{P}(Y = X) = \mathbb{P}(Y < X). \end{aligned} \end{equation}$$
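Since the events $\{X<Y\}$, $\{X=Y\}$ and $\{Y<X\}$ partition the sample space, this also gives the explicit value

$$\mathbb{P}(X<Y) = \mathbb{P}(Y<X) = \frac{1-\mathbb{P}(X=Y)}{2}.$$

For example, if $X$ and $Y$ are i.i.d. $\mathrm{Bernoulli}(\tfrac{1}{2})$ then $\mathbb{P}(X=Y)=\tfrac{1}{2}$, giving $\mathbb{P}(X<Y)=\mathbb{P}(Y<X)=\tfrac{1}{4}$, which matches the direct computation $\mathbb{P}(X=0,Y=1)=\tfrac{1}{4}$.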

Ben