
For a nonnegative random variable $X$, how can one prove that $E(X^n)^{\frac1n}$ is nondecreasing in $n$?

amoeba
Dhamnekar Winod

1 Answer


Write $p$ in place of $n$ to emphasize it can be any positive real number, rather than just an integer as suggested by "$n$".

Let's go through some standard preliminary transformations to simplify subsequent calculations. It makes no difference to the result to rescale $X$. The result is trivial if $X$ is almost everywhere zero, so assume $\mathbb{E}(X)$ is nonzero, whence $\mathbb{E}(X^p)$ also is nonzero for all $p$. Now fix $p$ and divide $X$ by $\mathbb{E}(X^p)^{1/p}$ so that $$\mathbb{E}(X^p) = 1\tag{1},$$ with no loss of generality.

Here's how the reasoning might proceed when you're trying to figure it out the first time and you're trying not to work too hard. I will leave detailed justifications of each step to you.

The expression $\mathbb{E}(X^p)^{1/p}$ is nondecreasing if and only if its logarithm is nondecreasing. That log is differentiable and therefore is nondecreasing if and only if its derivative is non-negative. Exploiting $(1)$ we may compute (by differentiating within the expectation) this derivative as

$$\frac{d}{dp}\log\left( \mathbb{E}(X^p)^{1/p} \right) = -\frac{1}{p^2}\log\mathbb{E}(X^p) + \frac{\mathbb{E}(X^p \log X)}{p\,\mathbb{E}(X^p)} = \frac{1}{p^2}\mathbb{E}(X^p \log(X^p)).$$

Writing $Y=X^p$ (so that $(1)$ gives $\mathbb{E}(Y)=1$), the right-hand side is non-negative if and only if $$\mathbb{E}(Y\log(Y)) \ge 0.$$ But this is an immediate consequence of Jensen's Inequality applied to the function $f(y) = y\log(y)$ (continuous on the nonnegative reals and differentiable on the positive reals), because differentiating twice shows $$f^{\prime\prime}(y) = \frac{1}{y}\gt 0$$ for $y\gt 0$, whence $f$ is convex on the non-negative reals, yielding

$$\mathbb{E}(Y \log Y) = \mathbb{E}(f(Y)) \ge f\left(\mathbb{E}(Y)\right) = f(1) = 0,$$

QED.
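(As a sanity check, not part of the proof: for $X$ exponential with rate $1$, $\mathbb{E}(X^p)=\Gamma(p+1)$, so the claim asserts $\Gamma(p+1)^{1/p}$ is nondecreasing in $p$. A short Python sketch verifies this on a grid of $p$ values; the choice of distribution and grid is just for illustration.)

```python
import math

# Numerical illustration (not a proof): for X ~ Exponential(1),
# E(X^p) = Gamma(p + 1), so E(X^p)^(1/p) = Gamma(p + 1)^(1/p)
# should be nondecreasing in p.
ps = [0.5 + 0.25 * k for k in range(40)]  # p from 0.5 to 10.25
norms = [math.gamma(p + 1.0) ** (1.0 / p) for p in ps]

# Check monotonicity across consecutive grid points.
assert all(a <= b for a, b in zip(norms, norms[1:]))
print("Gamma(p+1)^(1/p) is nondecreasing on the tested grid")
```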


Edit

Edward Nelson provides a wonderfully succinct demonstration. As a matter of (standard) notation, define $||x||_p = \mathbb{E}(|x|^p)^{1/p}$ for $1 \le p \lt \infty$ (and $||x||_\infty = \sup |x|$). Upon observing that the function $f(x) = |x|^p$ is convex, he applies Jensen's Inequality to conclude

$$|\mathbb{E}(x)|^p \le \mathbb{E}(|x|^p).$$

Here is the rest of the demonstration in his own words:

Applied to $|x|$ this gives $$||x||_1 \le ||x||_p,$$ and applied to $|x|^r$, where $1 \le r \lt \infty$, this gives $$||x||_r \le ||x||_{rp},$$ so that $||x||_p$ is an increasing function of $p$ for $1 \le p \le \infty$.
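(A side note, not part of Nelson's argument: the inequality $||x||_r \le ||x||_{rp}$ is easy to check numerically for the uniform probability measure on a finite sample. A Python sketch, with an arbitrary illustrative sample:)

```python
import numpy as np

# Check ||x||_r <= ||x||_{rp} under the uniform probability measure on a
# finite sample (a numerical illustration, not part of Nelson's proof).
def p_norm(x, p):
    """E(|x|^p)^(1/p) with equal weights 1/len(x)."""
    return np.mean(np.abs(x) ** p) ** (1.0 / p)

x = np.array([0.3, -1.7, 2.0, 0.1, 5.0])  # arbitrary sample
for r in (1.0, 1.5, 2.0):
    for p in (1.0, 2.0, 3.0):
        assert p_norm(x, r) <= p_norm(x, r * p) + 1e-12
print("||x||_r <= ||x||_{rp} holds for all tested (r, p)")
```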

Reference

Edward Nelson, Radically Elementary Probability Theory. Princeton University Press (1987): p. 5.

whuber
  • Would you explain how you computed the derivative of $\log(E(X^p)^{\frac{1}{p}})$? – Dhamnekar Winod Nov 05 '16 at 15:59
  • I used the product rule, because $$\log\left(\mathbb{E}(X^p)^{1/p}\right)=\frac{1}{p}\ \log\mathbb{E}(X^p).$$ I differentiated the second factor in the product by [differentiating under the integral sign.](https://en.wikipedia.org/wiki/Leibniz_integral_rule#General_form:_Differentiation_under_the_integral_sign) – whuber Nov 05 '16 at 20:05
  • How did you arrive at $E(X^p)=1$? You wrote that you divide $X$ by $E(X^p)^{\frac{1}{p}}$. – Dhamnekar Winod Nov 06 '16 at 13:40
  • Why didn't you multiply the second term in the derivative of $\log(E(X^p)^{\frac{1}{p}})$ by $\frac1p$? – Dhamnekar Winod Nov 10 '16 at 13:06
  • I did: it canceled another factor of $p$. But does it matter for the result? After all, we only need to know the sign of the derivative. – whuber Nov 10 '16 at 14:28