
The "Darth Vader rule" for the expected value of non-negative random variable is:

$$\mathbb{E}(X) = \int \limits_0^\infty (1-F_X(x)) \ dx.$$

This rule applies only to non-negative random variables. Is there any similar "integral rule" that gives the expected value of any random variable (including one that can be negative) in terms of the distribution function?
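For concreteness, here is a minimal numerical sanity check of the rule (a sketch assuming SciPy is available; the Exponential distribution with mean 1/2 is just an illustrative choice):

```python
import numpy as np
from scipy import integrate, stats

# Illustrative non-negative random variable: Exponential with rate 2,
# so the true mean is 1/2.
dist = stats.expon(scale=1/2)

# E(X) via the survival-function integral: int_0^inf (1 - F_X(x)) dx.
# dist.sf is the survival function 1 - F_X.
survival_integral, _ = integrate.quad(dist.sf, 0, np.inf)

print(survival_integral)  # ~0.5
print(dist.mean())        # 0.5
```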

Ben
  • What makes you say the rule (sometimes called the "Darth Vader rule") only applies to the continuous case? – Glen_b Sep 27 '20 at 01:02
  • Thanks Glen --- I had not heard that name before. I have edited the question to add the name and remove the continuity requirement. – Ben Sep 27 '20 at 01:06
  • 1
    A standard proof is shown in this answer: https://math.stackexchange.com/a/2137144/321264. – StubbornAtom Sep 27 '20 at 07:47
  • 1
    This has been thoroughly treated in several other posts -- they're just hard to find. For one general account, see my post at https://stats.stackexchange.com/a/222497/919. Another way to conceive of the problem is to decompose any random variable into its positive and negative parts, apply this formula to each part separately, and reassemble the results: you get the obvious linear combination of two integrals. That's basically a one-line derivation. – whuber Sep 27 '20 at 12:27
  • Thanks @whuber --- that other post is very interesting. – Ben Sep 27 '20 at 12:37

1 Answer


While the "Darth Vader rule" (a silly name) applies to any non-negative random variable, I am going to simplify the analysis by looking only at continuous random variables. Extension to discrete and mixed random variables should also be possible, but I will not pursue that here. In a related answer here we show a partial extension of the expectation rule. Specifically, it is shown that for an arbitrary continuous random variable $X$ and any constant $a \in \mathbb{R}$ you have the general rule:$^\dagger$

$$\mathbb{E}[\max(X-a,0)] = \int \limits_{a}^\infty (1-F_X(x)) \ dx.$$
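This identity is easy to check numerically (a sketch assuming SciPy; the Normal(1, 2) distribution and the threshold $a = -0.7$ are arbitrary illustrative choices):

```python
import numpy as np
from scipy import integrate, stats

# Illustrative continuous distribution and threshold.
dist = stats.norm(loc=1.0, scale=2.0)
a = -0.7

# RHS: integral of the survival function 1 - F_X from a to infinity.
rhs, _ = integrate.quad(dist.sf, a, np.inf)

# LHS: E[max(X - a, 0)] computed directly from the density;
# the integrand vanishes below a, so we integrate (x - a) f_X(x) over [a, inf).
lhs, _ = integrate.quad(lambda x: (x - a) * dist.pdf(x), a, np.inf)

print(lhs, rhs)  # both give the same value (up to quadrature error)
```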

We can go further by writing the expectation of an arbitrary continuous random variable $X$ as:

$$\begin{align} \mathbb{E}[X] &= \lim_{a\rightarrow -\infty} \mathbb{E}[\max(X,a)] \\[12pt] &= \lim_{a\rightarrow -\infty} \Big( a+\mathbb{E}[\max(X-a,0)] \Big) \\[12pt] &= \lim_{a\rightarrow -\infty} \Big( - \int \limits_{a}^0 \ dx + \mathbb{E}[\max(X-a,0)] \Big) \\[12pt] &= \lim_{a\rightarrow -\infty} \Bigg( - \int \limits_{a}^\infty \mathbb{I}(x < 0) \ dx + \int \limits_{a}^\infty (1-F_X(x)) \ dx \Bigg) \\[6pt] &= \lim_{a\rightarrow -\infty} \Bigg( \int \limits_{a}^\infty (1-\mathbb{I}(x < 0)-F_X(x)) \ dx \Bigg) \\[6pt] &= \lim_{a\rightarrow -\infty} \int \limits_{a}^\infty (\mathbb{I}(x \geqslant 0)-F_X(x)) \ dx \\[6pt] &= \int \limits_{-\infty}^\infty (\mathbb{I}(x \geqslant 0)-F_X(x)) \ dx. \\[6pt] \end{align}$$
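As a numerical check of this final form (a sketch assuming SciPy; the Normal(-3, 1.5) distribution is an arbitrary example of a variable that can be negative):

```python
import numpy as np
from scipy import integrate, stats

# Illustrative variable with negative mean.
dist = stats.norm(loc=-3.0, scale=1.5)

# Integrand is I(x >= 0) - F_X(x).  Split at zero so the quadrature
# does not straddle the jump in the indicator:
#   for x <  0 the integrand is -F_X(x);
#   for x >= 0 the integrand is 1 - F_X(x), i.e. the survival function.
left, _ = integrate.quad(lambda x: -dist.cdf(x), -np.inf, 0)
right, _ = integrate.quad(dist.sf, 0, np.inf)

print(left + right)  # ~ -3.0
print(dist.mean())   # -3.0
```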

In cases where the individual integrals are convergent, this can be written in simple form as:

$$\mathbb{E}[X] = \int \limits_0^\infty (1-F_X(x)) \ dx - \int \limits_0^\infty F_X(-x) \ dx.$$
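The two terms here are $\mathbb{E}[\max(X,0)]$ and $\mathbb{E}[\max(-X,0)]$, so the formula is just the decomposition of $X$ into its positive and negative parts. A numerical check of this split form (again a sketch assuming SciPy; the shifted Gamma distribution is an arbitrary choice with mass on both signs):

```python
import numpy as np
from scipy import integrate, stats

# Gamma(shape=2) shifted left by 3: mean = 2 - 3 = -1, with mass on both signs.
dist = stats.gamma(a=2.0, loc=-3.0)

# E[max(X, 0)]  = int_0^inf (1 - F_X(x)) dx
positive_part, _ = integrate.quad(dist.sf, 0, np.inf)

# E[max(-X, 0)] = int_0^inf F_X(-x) dx
negative_part, _ = integrate.quad(lambda x: dist.cdf(-x), 0, np.inf)

print(positive_part - negative_part)  # ~ -1.0
print(dist.mean())                    # -1.0
```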

This integral rule extends the Darth Vader rule from continuous non-negative random variables to general continuous random variables. (The extension for discrete random variables is similar, but you have to be a bit more careful with the boundaries of the integrals.) In the special case where $X$ is continuous and non-negative we have $F_X(-x) = 0$ for all $x \geqslant 0$, so the second term in this equation vanishes, giving the standard expectation rule. I have not seen this integral expression in any textbooks or papers, so it does not seem to be used much (if at all?) in statistical practice. Nevertheless, it does provide one possible extension of the standard integral rule to deal with random variables that can be negative.


$^\dagger$ In the special case where $X$ is non-negative and $a=0$ this reduces to the standard expectation rule for non-negative random variables shown in the question.

Ben