I noticed that if $X$ is a random variable taking values in $[0,1]$, then $V[X] \leq E[X](1-E[X])$, which in turn implies that the Bernoulli distribution maximizes variance among all distributions on $[0,1]$ (for any given mean, it is the distribution that attains the bound).
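Not part of the argument, but the bound is easy to sanity-check numerically. Here is a minimal sketch (assuming NumPy; the particular distributions are just illustrative choices) comparing the sample variance against $m(1-m)$ for a few distributions supported on $[0,1]$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# A few distributions supported on [0, 1]; the Bernoulli case
# should sit essentially at the bound m * (1 - m).
samples = {
    "uniform":        rng.uniform(0, 1, n),
    "beta(2, 5)":     rng.beta(2, 5, n),
    "bernoulli(0.3)": rng.binomial(1, 0.3, n).astype(float),
}

for name, x in samples.items():
    m, v = x.mean(), x.var()
    print(f"{name:15s} var = {v:.4f}   bound m(1-m) = {m*(1-m):.4f}")
```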

For interest's sake, consider the discrete case: let $p_i \geq 0$ be the probability that $X = x_i$, with each $x_i \in [0,1]$. Suppose, for contradiction, that $V[X] > E[X](1-E[X])$. Then

\begin{align} E[X^2] - E[X]^2 & > E[X](1-E[X]) = E[X] - E[X]^2 \\ E[X^2] & > E[X] \\ \sum_i x_i^2 p_i & > \sum_i x_i p_i \\ \sum_i (x_i-1)\,x_i\,p_i & > 0 \end{align}

This is a contradiction, since every term of the final sum satisfies $(x_i-1)\,x_i\,p_i \leq 0$: we have $x_i - 1 \leq 0$, $x_i \geq 0$, and $p_i \geq 0$.
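The same computation also pins down the equality case claimed above: since

\begin{align} E[X](1-E[X]) - V[X] = E[X] - E[X^2] = \sum_i (1-x_i)\,x_i\,p_i, \end{align}

and every term on the right is nonnegative, equality holds exactly when each $x_i$ with $p_i > 0$ lies in $\{0,1\}$, i.e. when $X$ is Bernoulli.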

Anyway, this is a rather handy fact, does this inequality have a name, generalization, or further extensions?

fairidox
  • Jensen's inequality? – Simone May 07 '13 at 06:20
  • I don't see a relationship to Jensen's... care to elaborate? – fairidox May 07 '13 at 06:42
  • Sorry, my bad. I didn't read it carefully. I just found this; they use this fact to obtain other upper bounds: stat.fsu.edu/techreports/M807.pdf – Simone May 07 '13 at 10:30
  • I don't know if the inequality $\operatorname{var}(X) \leq E[X](1-E[X])$ for random variables taking on values in $[0,1]$ has a name or not, but a generalized version (for random variables taking on values in $[0,c]$) can be found [here](http://stats.stackexchange.com/a/18593/6633). See also [this question](http://stats.stackexchange.com/q/18621/6633) and the answers to it. – Dilip Sarwate May 07 '13 at 12:55
  • 1
    There is a much, much simpler proof which is fully general: since $V[X]+E[X]^2$ = $E[X^2]$ and $|X|\le 1$, it is obvious that $E[X^2]\le E[1\cdot X] = E[X]$ and the result follows upon subtracting $E[X]^2$. (This is a trivial instance of [Holder's Inequality](http://en.wikipedia.org/wiki/H%C3%B6lder%27s_inequality), *inter alia*.) It can also be seen as an application of the Cauchy-Schwarz Inequality. – whuber May 07 '13 at 13:49

0 Answers