21

I was doing some work in scipy and a conversation came up with a member of the core scipy group about whether a non-negative discrete random variable can have an undefined moment. I think he is correct but do not have a proof handy. Can anyone show/prove this claim? (Or, if the claim is not true, disprove it.)

I don't have an example handy, but if the discrete random variable has support on all of $\mathbb{Z}$, it seems that some discretized version of the Cauchy distribution should serve as an example with an undefined moment. The condition of non-negativity (perhaps including $0$) is what seems to make the problem challenging (at least for me).

Lucas Roberts
  • 3,819
  • 16
  • 45

4 Answers

39

Here's a famous example: Let $X$ take value $2^k$ with probability $2^{-k}$, for each integer $k\ge1$. Then $X$ takes values in (a subset of) the positive integers; the total mass is $\sum_{k=1}^\infty 2^{-k}=1$, but its expectation is $$E(X) = \sum_{k=1}^\infty 2^k P(X=2^k) = \sum_{k=1}^\infty 1 = \infty. $$ This random variable $X$ arises in the St. Petersburg paradox.
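
A quick numerical illustration (my addition, not part of the original answer; the helper name is mine): simulate $X$ and watch the running sample mean grow instead of converging.

```python
import numpy as np

rng = np.random.default_rng(0)

def st_petersburg_sample(size):
    """Draw X = 2^K with K ~ Geometric(1/2), i.e. P(X = 2^k) = 2^(-k)."""
    k = rng.geometric(0.5, size=size)  # k = 1, 2, 3, ...
    return 2.0 ** k

for n in [10**3, 10**5, 10**7]:
    x = st_petersburg_sample(n)
    print(f"n = {n:>8}: sample mean = {x.mean():.1f}")
# The sample mean keeps growing (roughly like log2(n)) rather than
# settling down: the hallmark of an infinite expectation.
```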

grand_chat
  • 2,632
  • 1
  • 8
  • 11
16

Let the CDF $F$ equal $1-1/n$ at the integers $n=1,2,\ldots,$ be piecewise constant everywhere else, and satisfy all the criteria to be a CDF. The expectation is

$$\int_{0}^\infty (1-F(x))\mathrm{d}x = 1/2 + 1/3 + 1/4 + \cdots$$

which diverges. In this sense the first moment (and therefore all higher moments) is infinite. (See remarks at the end for further elaboration.)


If you're uncomfortable with this notation, note that for $n=1,2,3,\ldots,$

$${\Pr}_{F}(n) = \frac{1}{n} - \frac{1}{n+1}.$$

This defines a probability distribution since each term is positive and $$\sum_{n=1}^\infty {\Pr}_{F}(n) = \sum_{n=1}^\infty \left(\frac{1}{n} - \frac{1}{n+1}\right) = \lim_{n\to \infty} 1 - \frac{1}{n+1} = 1.$$

The expectation is

$$\sum_{n=1}^\infty n\,{\Pr}_{F}(n) = \sum_{n=1}^\infty n\left(\frac{1}{n} - \frac{1}{n+1}\right) =\sum_{n=1}^\infty \frac{1}{n+1} = 1/2 + 1/3 + 1/4 + \cdots$$

which diverges.
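
As a quick numerical check (my addition, using only the series above), the partial sums of this expectation series track $\log N$ and grow without bound:

```python
import math

# Partial sums S_N = sum_{n=1}^N n * Pr_F(n) = sum_{n=1}^N 1/(n+1),
# which behave like log(N) and therefore diverge.
for N in [10, 10**3, 10**5, 10**6]:
    S = sum(1.0 / (n + 1) for n in range(1, N + 1))
    print(f"N = {N:>7}: partial sum = {S:.4f}, log(N) = {math.log(N):.4f}")
```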

This way of expressing the answer makes it clear that all solutions are obtained by such divergent series. Indeed, if you would like the distribution to be supported on some subset of the positive values $x_1, x_2, \ldots, x_n, \ldots,$ with probabilities $p_1, p_2, \ldots$ summing to unity, then for the expectation to diverge the series which expresses it, namely

$$(a_n) = (x_n p_n),$$

must have divergent partial sums.

Conversely, every divergent series $(a_n)$ of non-negative numbers is associated with many discrete positive distributions having divergent expectation. For instance, given $(a_n)$ you could apply the following algorithm to determine sequences $(x_n)$ and $(p_n)$. Begin by setting $q_n = 2^{-n}$ and $y_n = 2^n a_n$ for $n=1, 2, \ldots.$ Define $\Omega$ to be the set of all $y_n$ that arise in this way, index its elements as $\Omega=\{\omega_1, \omega_2, \ldots, \omega_i, \ldots\},$ and define a probability distribution on $\Omega$ by

$$\Pr(\omega_i) = \sum_{n \mid y_n = \omega_i}q_n.$$

This works because the sum of the $p_n$ equals the sum of the $q_n,$ which is $1,$ and $\Omega$ has at most a countable number of positive elements.

As an example, the series $(a_n) = (1, 1/2, 1, 1/2, \ldots)$ obviously diverges. The algorithm gives

$$y_1 = 2a_1 = 2;\ y_2 = 2^2 a_2 = 2;\ y_3 = 2^3 a_3 = 8; \ldots$$

Thus $$\Omega = \{2, 8, 32, 128, \ldots, 2^{2n+1},\ldots\}$$

is the set of odd positive powers of $2$ and $$p_1 = q_1 + q_2 = 3/4;\ p_2 = q_3 + q_4 = 3/16;\ p_3 = q_5 + q_6 = 3/64; \ldots$$
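
Here is a minimal Python sketch of the construction (my paraphrase; the function name `series_to_distribution` is mine), run on the same example series:

```python
from collections import defaultdict

def series_to_distribution(a, n_terms):
    """The construction above: given terms a(n) of a divergent non-negative
    series, set q_n = 2^(-n), y_n = 2^n * a(n), and Pr(omega) = sum of the
    q_n for which y_n = omega. (Truncated to n_terms terms.)"""
    pr = defaultdict(float)
    for n in range(1, n_terms + 1):
        pr[2.0 ** n * a(n)] += 2.0 ** (-n)
    return dict(pr)

# The example series (a_n) = (1, 1/2, 1, 1/2, ...):
a = lambda n: 1.0 if n % 2 == 1 else 0.5
dist = series_to_distribution(a, 20)

print(sorted(dist.items())[:3])   # [(2.0, 0.75), (8.0, 0.1875), (32.0, 0.046875)]
print("total mass:", sum(dist.values()))                 # ~1 (truncation error)
print("truncated E[X]:", sum(w * p for w, p in dist.items()))
# Each support point 2^(2k-1) carries mass 3 * 4^(-k) and contributes
# 2^(2k-1) * 3 * 4^(-k) = 3/2 to the mean, so E[X] diverges.
```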


About infinite and non-existent moments

When all the values are positive, there is no such thing as an "undefined" moment: moments all exist, but they can be infinite in the sense of a divergent sum (or integral), as shown at the outset of this answer.

Generally, all moments are defined for positive random variables, because the sum or integral that expresses them either converges absolutely or it diverges (is "infinite.") In contrast to that, moments can become undefined for variables that take on positive and negative values, because--by definition of the Lebesgue integral--the moment is the difference between a moment of the positive part and a moment of the absolute value of the negative part. If both those are infinite, convergence is not absolute and you face the problem of subtracting an infinity from an infinity: that does not exist.
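
To make that last point concrete (my illustration, not part of the answer): for a discretized Cauchy-like distribution on all of $\mathbb{Z}$ with $p(n) \propto 1/(1+n^2)$, the positive-part and negative-part moment sums both diverge, so the mean is undefined rather than infinite.

```python
import numpy as np

# Positive-part moment sum for weights ~ 1/(1 + n^2) on the integers.
# It grows like (1/2) log N; by symmetry the negative part does the same,
# so E[X] = E[X+] - E[X-] is an "infinity minus infinity": undefined.
for N in [10**2, 10**4, 10**6]:
    n = np.arange(1, N + 1)
    print(N, (n / (1.0 + n**2)).sum())
```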

whuber
  • 281,159
  • 54
  • 637
  • 1,101
  • does this argument give an example of an infinite moment or an undefined moment? I'm looking for an undefined moment. Maybe there is a subtlety of undefined versus infinite moments that I am missing to fully understand your answer. – Lucas Roberts Sep 14 '18 at 12:07
  • 2
    When all the values are positive, there is no such thing as an "undefined" moment: moments all exist, but they can be infinite. – whuber Sep 14 '18 at 14:36
  • I see now that I wrote "does not exist" in the question heading, which is interpreted as an infinite moment. I was in fact interested in an undefined moment, as indicated by the text of the post where I wrote "undefined moment" on the second line. – Lucas Roberts Sep 14 '18 at 15:20
  • 4
    *All* moments are defined for positive random variables. Some may be infinite, that's all. Moments can become undefined for variables that take on positive and negative values, because--by definition of the Lebesgue integral--the moment is the difference between a moment of the positive part and a moment of the absolute value of the negative part. If both those are infinite, you face the problem of subtracting an infinity from an infinity: *that* does not exist. – whuber Sep 14 '18 at 15:54
  • 1
    "All moments are defined for positive random variables. Some may be infinite, that's all." Given that the title of the question concerns moments *not existing*, I think a lot of this comment deserves to be edited into the answer! – Silverfish Sep 14 '18 at 16:50
  • @whuber, yes I agree w/@Silverfish comment. The comment/explanation really concisely clarifies the issue. If you could add this at the top of your answer I will mark yours correct. Please keep the rest of the answer though, the example given is truly wonderful and would be a shame to lose. – Lucas Roberts Sep 14 '18 at 22:17
  • @whuber, thanks for adding the comment to the answer. Marking as correct. There are some other really interesting responses on this thread too. I'm glad to see the St. Petersburg paradox get put up (and get upvoted so much). – Lucas Roberts Sep 15 '18 at 18:11
  • 1
    I guess I could've found the answer buried in this post: https://stats.stackexchange.com/questions/243150/does-the-k-th-moment-exists-when-exk-is-infinite-in-ether-one-positive-or?rq=1 – Lucas Roberts Sep 15 '18 at 18:12
9
  1. The zeta distribution is a fairly well-known discrete distribution on the positive integers that doesn't have a finite mean (for $1<\theta\leq 2$); see the scipy sketch after this list.

    $P(X=x\mid\theta)=\frac{1}{\zeta(\theta)}\,x^{-\theta}\,,\quad x=1,2,\ldots,\quad\theta>1$

    where the normalizing constant involves $\zeta(\cdot)$, the Riemann zeta function

    (edit: The case $\theta=2$ is very similar to whuber's answer)

    Another distribution with similar tail behaviour is the Yule-Simon distribution.

  2. Another example would be the beta-negative binomial distribution with $0<\alpha\leq 1$:

    $P(X=x\mid\alpha,\beta,r)=\frac{\Gamma(r+x)}{x!\,\Gamma(r)}\,\frac{\mathrm{B}(\alpha+r,\beta+x)}{\mathrm{B}(\alpha,\beta)}\,,\quad x=0,1,2,\ldots,\quad\alpha,\beta,r>0$
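
A short scipy sketch of the zeta case (my addition; scipy implements the zeta distribution as `scipy.stats.zipf`). For $1<\theta\leq 2$ the mean is reported as infinite, and partial sums of $\sum_x x\,P(X=x)$ grow without bound:

```python
import numpy as np
from scipy.stats import zipf

theta = 1.5                      # in (1, 2], so the mean diverges
print(zipf.mean(theta))          # -> inf

# Partial sums of E[X] = sum_x x * pmf(x) grow roughly like 2*sqrt(N)/zeta(theta):
for N in [10**2, 10**4, 10**6]:
    x = np.arange(1, N + 1)
    print(N, np.sum(x * zipf.pmf(x, theta)))
```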

Glen_b
  • 257,508
  • 32
  • 553
  • 939
0

some discretized version of the Cauchy distribution

Yes: if you take $p(n)$ to be the average value of the Cauchy density on the interval around $n$, then clearly its zeroth moment is the same as that of the Cauchy distribution (both equal one), and its first moment is undefined for the same reason the Cauchy's is: the positive and negative tail sums both diverge. As for "the interval around $n$", it doesn't really matter how you define it; take $(n-1,n]$, $[n,n+1)$, $[n-.5,n+.5)$, vel cetera, and it will work. For positive integers, you can also take $p(n) = \frac{6}{(n\pi)^2}$. The zeroth moment sums to one, and the first moment is the sum of $\frac{6}{n\pi^2}$, which diverges.
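
A quick numerical check of the positive-integer example (my addition): the probabilities sum to one, while partial sums of the first moment grow like $\log N$.

```python
import numpy as np

# p(n) = 6/(n*pi)^2 on n = 1, 2, ...: total mass is 1 (since sum 1/n^2 = pi^2/6),
# while the first-moment partial sums sum 6/(n*pi^2) ~ (6/pi^2) log N diverge.
for N in [10**2, 10**4, 10**6]:
    n = np.arange(1, N + 1)
    p = 6.0 / (n * np.pi) ** 2
    print(N, p.sum(), (n * p).sum())
```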

And in fact for any polynomial $p(n)$ of degree $k \geq 2$ that is positive on the positive integers, there is some $c$ such that $\frac{c}{p(n)}$ sums to 1 (the terms decay like $n^{-k}$, so the sum converges). If we then take the $k$th moment, that will diverge (indeed, already the $(k-1)$th moment does).

Acccumulation
  • 3,688
  • 5
  • 11