
Consider a random variable $X$ with density function $f_X(x)$ and zero mean.

Define the following quantities:

(1) $E[X^2] := \int_{-\infty}^{+\infty} x^2 f_X(x) dx$

(2) $E[ |X| ] := \int_{-\infty}^{+\infty} |x| f_X(x) dx$

I see that $E[X^2]$ is the variance (since the mean is zero), but I have no idea whether $E[|X|]$ is a well-known and useful quantity in probability.

Anyway, here is the question: I wonder if there is some functional (in)equality between $E[X^2]$ and $E[|X|]$ — specifically, whether there exists a mapping $\rho : \mathbb{R}_{\geq 0} \to \mathbb{R}_{\geq 0}$ such that $E[|X|] \leq \rho(E[X^2])$.

You may make any fair assumption on the density $f_X(x)$ if required. The zero-mean assumption is made to make life easier; you may drop it if necessary.

Navid Noroozi
    This is a special case of https://stats.stackexchange.com/questions/244202. – whuber Apr 30 '19 at 17:44
  • $E(|X-\mu|)$ is often called the [mean deviation](http://mathworld.wolfram.com/MeanDeviation.html) or the mean absolute deviation. The relationship between mean deviation and standard deviation is discussed in several questions on site. One example -- though in answer to a narrower question -- is [here](https://stats.stackexchange.com/a/70879/805); note that the result there uses the standard (convex) version of Jensen's inequality. – Glen_b Apr 30 '19 at 23:39

1 Answer


By Jensen's inequality (applied to the concave function $t \mapsto \sqrt{t}$ rather than a convex one), we have $$ E[|X|] = E[\sqrt{X^2}] \leq \sqrt{E[X^2]}. $$ Alternatively, the Cauchy–Schwarz inequality yields the same inequality: $$ E[|X|]^2 = E[|X| \cdot 1]^2 \leq E[|X|^2]\, E[1^2] = E[X^2]. $$ In the notation of the question, $\rho(t) = \sqrt{t}$ works. This makes no assumptions about the existence of a probability density or having mean zero (or any mean, for that matter).
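As a quick sanity check, the inequality $E[|X|] \leq \sqrt{E[X^2]}$ can be verified empirically on Monte Carlo samples. The sketch below (the distribution choices and sample size are arbitrary illustrations, not from the original post) compares the sample mean of $|X|$ against the sample root mean square:

```python
import numpy as np

# Empirical check of E[|X|] <= sqrt(E[X^2]) on samples from a few
# distributions (choices here are illustrative only).
rng = np.random.default_rng(0)
samples = {
    "normal": rng.normal(size=100_000),
    "uniform": rng.uniform(-1.0, 1.0, size=100_000),
    "laplace": rng.laplace(size=100_000),
}

for name, x in samples.items():
    mean_abs = np.abs(x).mean()        # sample estimate of E[|X|]
    rms = np.sqrt((x ** 2).mean())     # sample estimate of sqrt(E[X^2])
    print(f"{name}: E|X| ~ {mean_abs:.4f} <= sqrt(E[X^2]) ~ {rms:.4f}")
    assert mean_abs <= rms
```

For the standard normal, for instance, $E[|X|] = \sqrt{2/\pi} \approx 0.798$ while $\sqrt{E[X^2]} = 1$, so the gap between the two sides is clearly visible in the printed estimates.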

Artem Mavrin