25

We know that the standard deviation (SD) measures the level of dispersion of a distribution. Thus a distribution that takes only one value (e.g., 1, 1, 1, 1) has an SD equal to zero. Such a distribution also requires little information to describe. On the other hand, a distribution with a high SD seems to require many bits of information to describe, so we might say its entropy level is high.

[Figure: comparison of standard deviations — http://upload.wikimedia.org/wikipedia/commons/thumb/f/f9/Comparison_standard_deviations.svg/612px-Comparison_standard_deviations.svg.png]

So my question is: is SD the same as entropy?

If not, what relationship exists between these two measures?
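To make this concrete, here is a small sketch of what I mean (just an illustration I put together, using Shannon entropy in bits for a discrete distribution):

```python
import numpy as np

def sd_and_entropy(values, probs):
    # SD and Shannon entropy (in bits) of a discrete distribution
    values, probs = np.asarray(values, float), np.asarray(probs, float)
    mean = np.sum(values * probs)
    sd = np.sqrt(np.sum((values - mean) ** 2 * probs))
    entropy = -np.sum(probs[probs > 0] * np.log2(probs[probs > 0]))
    return sd, entropy

print(sd_and_entropy([1], [1.0]))               # degenerate: SD = 0, H = 0 bits
print(sd_and_entropy(range(1, 9), [1/8] * 8))   # uniform on 1..8: SD ≈ 2.29, H = 3 bits
```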

Edgar Derby
    see https://en.wikipedia.org/wiki/Entropic_uncertainty section "Entropy versus variance bounds" – lowtech Mar 27 '19 at 03:07

3 Answers

29

They are not the same. If you take a bimodal distribution with two peaks and let the spacing between them vary, the standard deviation increases as the peaks move apart. However, the entropy $$H(f) = -\int f(x) \log f(x)\, dx$$ depends only on the values the density takes, not on where it takes them, so the entropy would be the same (as long as the peaks do not overlap, moving them farther apart does not change it).
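To see this numerically, here is a small sketch (the specific choice of an equal-weight mixture of two narrow Gaussian peaks at $\pm d/2$ is just illustrative): as $d$ grows, the SD grows roughly like $d/2$, while the differential entropy stays essentially constant once the peaks no longer overlap.

```python
import numpy as np

def mixture_pdf(x, d, sigma=0.5):
    # equal-weight mixture of N(-d/2, sigma^2) and N(+d/2, sigma^2)
    g = lambda m: np.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    return 0.5 * (g(-d / 2) + g(d / 2))

for d in (5.0, 10.0, 20.0):
    x = np.linspace(-d, d, 200001)                     # grid wide enough to cover both peaks
    dx = x[1] - x[0]
    f = mixture_pdf(x, d)
    mean = np.sum(x * f) * dx
    sd = np.sqrt(np.sum((x - mean) ** 2 * f) * dx)
    H = -np.sum(f[f > 0] * np.log(f[f > 0])) * dx      # -∫ f log f dx, in nats
    print(f"d = {d:5.1f}   SD = {sd:6.3f}   H = {H:.4f} nats")
```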

Nick Alger
13

More counterexamples:

  1. Let $X$ be a discrete random variable taking the two values $-a$ and $a$ with equal probability. Then the variance $\sigma_X^2=a^2$ increases with $a$, but the entropy is constant, $H(X)=1$ bit.

  2. Let $X$ be a discrete rv taking values in $\{1, \dots, N\}$ with some arbitrary non-uniform distribution $p(X)$. If we permute the values of $p(X)$, the variance will change (it decreases if we move the larger probabilities towards the center), but the entropy is constant.

  3. Let $X$ be a continuous rv with the uniform distribution on the interval $[-1,1]$, $p(X)=1/2$. Modify it so that the probability (on the same support) is larger towards the extremes, say $p(Y)=|Y|$. Then $\sigma^2_Y > \sigma_X^2$ but $H(Y)< H(X)$ (the uniform distribution maximizes the entropy for a fixed compact support); see the numerical check after this list.
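A quick numerical check of counterexample 3 (an illustrative sketch; the grid-based integration is approximate): on $[-1,1]$, the density $p(y)=|y|$ has larger variance but smaller differential entropy than the uniform density $p(x)=1/2$.

```python
import numpy as np

x = np.linspace(-1, 1, 400001)
dx = x[1] - x[0]

def var_and_entropy(p):
    # variance and differential entropy (in nats) of a density sampled on the grid x
    mean = np.sum(x * p) * dx
    var = np.sum((x - mean) ** 2 * p) * dx
    mask = p > 0
    H = -np.sum(p[mask] * np.log(p[mask])) * dx
    return var, H

uniform = np.full_like(x, 0.5)   # p(x) = 1/2
peaked = np.abs(x)               # p(y) = |y|, mass pushed towards the endpoints

print("uniform:", var_and_entropy(uniform))   # ~ (1/3, ln 2 ≈ 0.693)
print("|y|    :", var_and_entropy(peaked))    # ~ (1/2, 0.5)
```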

leonbloy
10

Entropy and standard deviation are certainly not the same, but within many common parametric families the entropy is an increasing function of the standard deviation of the distribution. Two examples:

For the Exponential distribution, with density function $$\lambda e^{-\lambda x},\;\; x\ge 0, \qquad SD=1/\lambda,$$ we have

$$H(X) = 1-\ln\lambda = 1+\ln SD.$$

So as the SD increases, so does the (here, differential) entropy.

For the Normal distribution, with density function $$\frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x - \mu)^2}{2 \sigma^2}}, \;\; SD = \sigma$$ we have

$$H(X) = \frac12 \ln(2 \pi e \, \sigma^2) = \frac12 \ln(2 \pi e) +\ln SD, $$ so again the differential entropy increases with the SD.

(Note that differential entropy can be negative.)
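A quick check of both formulas (an illustrative sketch assuming SciPy is available; its `entropy()` method returns the differential entropy in nats for continuous distributions). The $\sigma = 0.1$ case also shows the negative value mentioned above:

```python
import numpy as np
from scipy import stats

# Exponential: H = 1 - ln(lambda) = 1 + ln(SD)
for lam in (0.5, 1.0, 4.0):
    closed_form = 1 - np.log(lam)
    numeric = stats.expon(scale=1 / lam).entropy()
    print(f"Exponential lambda={lam}: formula={closed_form:.4f}, scipy={numeric:.4f}")

# Normal: H = 0.5 * ln(2*pi*e*sigma^2), negative when sigma is small enough
for sigma in (0.1, 1.0, 10.0):
    closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)
    numeric = stats.norm(scale=sigma).entropy()
    print(f"Normal sigma={sigma}: formula={closed_form:.4f}, scipy={numeric:.4f}")
```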

Alecos Papadopoulos