
This is from the book *Fundamentals of Probability with Stochastic Processes* by Saeed Ghahramani, pages 249-250, which asserts that for any non-negative random variable $X$, the expectation of $X$ is

$$ E(X)= \int_{0}^{\infty} [1-F(t)]dt= \int_{0}^{\infty} P(X>t) dt $$

where $F(t)$ is the cumulative distribution function of $X$. Somehow this is equal to $\int_{0}^{\infty} x f(x)\, dx$, where $f$ is the density of $X$, and I don't see it. Though a proof is provided in the book, I find it lacking and wasn't completely satisfied with it.
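(As a quick sanity check of the tail-integral formula, here is a small Python sketch of my own, not from the book: for an Exponential(1) variable the survival function is $P(X > t) = e^{-t}$ and the true mean is $1$, so a Riemann sum of the survival function should come out close to $1$.)

```python
import math

# Check E(X) = ∫_0^∞ P(X > t) dt numerically for X ~ Exponential(rate = 1).
# Survival function: P(X > t) = exp(-t); the true mean is 1.
def survival(t, rate=1.0):
    return math.exp(-rate * t)

dt = 1e-4
upper = 50.0  # the tail beyond t = 50 is negligible for rate 1
approx_mean = sum(survival(k * dt) * dt for k in range(int(upper / dt)))
print(approx_mean)  # close to 1.0
```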

I would like to see a proof of the discrete analog of the above expression

$$ E(X) = \sum_{x=0}^{\infty} x \cdot P(X=x) = \sum_{x=0}^{\infty} [1 - F(x)] = \sum_{x=0}^{\infty} \sum_{y=x+1}^{\infty} P(X=y) $$

This is a very basic probability question.

kjetil b halvorsen
Kiran Kuppa
  • https://stats.stackexchange.com/questions/164788/expected-value-as-a-function-of-quantiles/164790#164790 – kjetil b halvorsen Sep 26 '17 at 13:51
  • This is the consequence of a basic integration by parts. – Xi'an Sep 26 '17 at 15:13
  • @kjetil e.g. See https://stats.stackexchange.com/questions/18438/does-a-univariate-random-variables-mean-always-equal-the-integral-of-its-quanti ... and also several of the posts linked from there – Glen_b Sep 26 '17 at 15:34
  • For an explicit answer to the specific question being asked, see [this answer](https://math.stackexchange.com/a/64227/15941) on math.SE. – Dilip Sarwate Sep 26 '17 at 17:52
  • Anticipating questions like this, I wrote out a full and general demonstration of this result at https://stats.stackexchange.com/questions/222478/expectation-of-a-function-of-a-random-variable-from-cdf/222497#222497. It covers both the integral and the sum and describes some (weak) assumptions needed. – whuber Sep 26 '17 at 20:10

1 Answer

For the discrete case, assume that $X \ge 0$ takes non-negative integer values. Then we can write the expectation as
$$ \DeclareMathOperator{\E}{\mathbb{E}} \DeclareMathOperator{\P}{\mathbb{P}} \E X = \sum_{k=0}^\infty k \P(X=k). $$
We will first write this as a double sum, and then change the order of summation. Observe that $k = \sum_{j=0}^{k-1} 1$ (for $k=0$ the upper limit is below the lower limit; we take that as the empty sum, which is zero). This gives
$$ \E X = \sum_{k=0}^\infty \sum_{j=0}^{k-1} 1 \cdot \P(X=k). $$
Now, in this double sum we sum first on $j$, which clearly goes to $\infty$. Observe that in the inner summation the indices satisfy the inequality
$$ 0 \le j \le k-1. $$
Solving that for $k$ gives $k \ge j+1$, which then gives the limits of summation in the new inner sum:
$$ \E X = \sum_{j=0}^\infty \sum_{k=j+1}^\infty \P(X=k) = \sum_{j=0}^\infty \P(X > j), $$
which is the result. The continuous case is similar.
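(The interchange of summation orders can also be verified numerically. Here is a small Python sketch of my own illustration, using a geometric distribution on $\{0, 1, 2, \dots\}$ with $P(X=k) = (1-q)q^k$, whose mean is $q/(1-q)$; the sums are truncated at a point where the tail is negligible.)

```python
# Verify E[X] = Σ_j P(X > j) for a geometric distribution on {0, 1, 2, ...}
# with P(X = k) = (1 - q) q^k, which has mean q / (1 - q).
q = 0.6
N = 200  # truncation point; q**200 is vanishingly small

pmf = [(1 - q) * q**k for k in range(N)]
direct = sum(k * pmf[k] for k in range(N))                          # Σ k P(X = k)
tail = sum(sum(pmf[k] for k in range(j + 1, N)) for j in range(N))  # Σ_j P(X > j)

print(direct, tail, q / (1 - q))  # all three ≈ 1.5
```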

kjetil b halvorsen
  • +1 This technique is called [summation by parts](https://en.wikipedia.org/wiki/Summation_by_parts). – whuber Sep 26 '17 at 20:12