For an arbitrary continuous random variable, say $X$, is its differential entropy always less than $\infty$? (It's ok if it's $-\infty$.) If not, what's the necessary and sufficient condition for it to be less than $\infty$?
-
Did you try any examples? Like, uniform distribution on an interval of length $L$? – Piotr Migdal Jun 08 '15 at 09:10
-
Indeed, the differential entropy of a uniform distribution (on any finite interval of length $L$) is always finite, namely $\log(L)$, hence bounded. In fact, I could identify two classes of continuous distributions whose entropy is always finite: (1) any distribution whose support is contained in a finite interval, and (2) any distribution whose 2nd moment is finite. The former is bounded by the entropy of the uniform distribution, while the latter is bounded by that of the Gaussian distribution. – syeh_106 Jun 09 '15 at 02:54
-
In fact, I can also construct a distribution with an infinite 2nd moment that still has finite entropy. For example, consider f(x) = 3/(x^2), x>3. Clearly E[X^2] is infinite, but h(X) ~= -3.1 nats. However, I haven't been able to confirm whether this holds for arbitrary continuous random variables, or to come up with a counterexample to refute it. I'd really appreciate it if someone could show this. – syeh_106 Jun 09 '15 at 02:55
-
Take $L\to\infty$. For entropy vs variance - yes (see the Wikipedia page on differential entropy or the respective chapter of Cover & Thomas, "Elements of Information Theory"). – Piotr Migdal Jun 09 '15 at 07:16
-
BTW: [Can the entropy of a random variable with countably many outcomes be infinite? - Math.SE](http://math.stackexchange.com/questions/279304/) or [this example](https://hkn.eecs.berkeley.edu/~%20calbear/research/Hinf.pdf). – Piotr Migdal Jun 09 '15 at 07:20
-
Thank you for your comments & the links, Piotr. Incidentally, I also checked one of my course materials and found exactly the same example of a discrete random variable with countably infinite support. Motivated by this, it's not difficult to construct a continuous analog. So the answer to the first question is evident. I'll summarize it below for other folks who may have the same question. BTW, I need to make a correction in my 2nd comment above: for f(x) = 3/(x^2), h(X) should be positive, i.e. 3.1 nats. – syeh_106 Jun 10 '15 at 01:41
-
This question and the answer are ambiguous because they do not state over which sets the bounds are to be applied. If $X$ is an RV, then it has an entropy, period. If it is an "arbitrary" continuous RV, then (obviously) there is no upper bound possible. What constraints do you intend to impose on $X$? From the comments and your answer it appears you might want to fix the support of $X$--or maybe not? Perhaps you want to limit $X$ to those variables with given bounds on certain moments? Perhaps you want $X$ to be in a parametric family--or maybe not? Please edit this question to clarify. – whuber Jun 10 '15 at 15:50
-
You are of course right, whuber; thanks for pointing this out. I shouldn't have used the term "bounded from above"; it's confusing. Since the question concerns the entropy of a single continuous random variable, it's just one value. So I can simply ask whether its value is less than $\infty$. (Note that I'm only concerned about the positive end; it's ok if it's $-\infty$. Note also that I did intend to ask a very general question, i.e. for an *arbitrary* continuous random variable.) – syeh_106 Jun 11 '15 at 01:49
-
The 2nd question should also be rephrased accordingly, i.e. what's the necessary *and* sufficient condition for the entropy of a continuous random variable to be less than $\infty$, or if that's difficult, what are the necessary *or* sufficient conditions? – syeh_106 Jun 11 '15 at 01:58
-
The question & answer have been edited to remove the aforementioned problem. Please let me know if there's something ambiguous. Thank you! – syeh_106 Jun 11 '15 at 02:50
-
It has been completely cleared up. I enjoyed reading your answer, too, now that I can understand it! – whuber Jun 11 '15 at 14:29
1 Answer
I thought about this question some more and managed to find a counterexample, thanks also to Piotr's comments above. The answer to the first question is no - the differential entropy of a continuous random variable (RV) is not always less than $\infty$. For example, consider a continuous RV $X$ whose pdf is $$f(x) = \frac{\log(2)}{x \log(x)^2}$$ for $x > 2$.
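One can check that this is indeed a valid pdf, since
$$\int_2^\infty \frac{\log 2}{x\log^2 x}\,dx \;=\; \log 2\,\Big[-\frac{1}{\log x}\Big]_2^\infty \;=\; \frac{\log 2}{\log 2} \;=\; 1.$$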
It's not hard to verify that its differential entropy is infinite. It grows quite slowly though: the partial integral $\int_2^T f(x)\log\frac{1}{f(x)}\,dx$ grows only like $\log\log T$ as $T \to \infty$.
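To sketch the verification (natural logs throughout): $f(x) < 1$ on the entire support, so the integrand $f\log\frac{1}{f}$ is nonnegative, and for $x \ge e$ we have $\log\frac{1}{f(x)} = \log x + 2\log\log x - \log\log 2 \ge \log x$. Hence
$$h(X) \;=\; \int_2^\infty f(x)\log\frac{1}{f(x)}\,dx \;\ge\; \int_e^\infty \frac{\log 2}{x\log^2 x}\,\log x\,dx \;=\; \log 2 \int_e^\infty \frac{dx}{x\log x} \;=\; \log 2\,\Big[\log\log x\Big]_e^\infty \;=\; \infty.$$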
For the 2nd question, I am not aware of a simple necessary and sufficient condition. However, one partial answer is as follows. Categorize a continuous RV into one of the following 3 types based on its support:
Type 1: a continuous RV whose support is bounded, i.e. contained in $[a, b]$.
Type 2: a continuous RV whose support is half-bounded, i.e. contained in $[a, \infty)$ or $(-\infty, a]$.
Type 3: a continuous RV whose support is unbounded.
Then we have the following:
For a Type 1 RV, its entropy is always less than $\infty$, unconditionally.
For a Type 2 RV, its entropy is less than $\infty$, if its mean ($\mu$) is finite.
For a Type 3 RV, its entropy is less than $\infty$, if its variance ($\sigma^2$) is finite.
The differential entropy of a Type 1 RV is upper-bounded by that of the corresponding uniform distribution, i.e. $\log(b-a)$; that of a Type 2 RV by the entropy of the exponential distribution with the same mean, i.e. $1+\log(|\mu-a|)$; and that of a Type 3 RV by the entropy of the Gaussian distribution with the same variance, i.e. $\frac{1}{2} \log(2{\pi}e\sigma^2)$.
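A sketch of why these maximum-entropy bounds hold (the standard cross-entropy argument, as in Cover & Thomas): for any density $g$ whose support contains that of $f$,
$$h(f) \;=\; -\int f\log f \;=\; -\int f\log g \;-\; D(f\,\|\,g) \;\le\; -\int f\log g,$$
since the relative entropy $D(f\,\|\,g) \ge 0$. Taking $g$ to be the uniform density on $[a,b]$, the exponential density with the same mean as $f$ (shifted to start at $a$), or the Gaussian density with the same mean and variance as $f$, and evaluating $-\int f\log g$ in each case, yields the three bounds above.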
Note that for a Type 2 or Type 3 RV, the above condition is only sufficient, not necessary. For example, consider a Type 2 RV with $$f(x) = \frac{3}{x^2}$$ for $x > 3$. Clearly, its mean is infinite, yet its entropy is about 3.1 nats. Or consider a Type 3 RV with $$f(x) = \frac{9}{|x|^3}$$ for $|x| > 3$. Its variance is infinite, but its entropy is about 2.6 nats. So it would be great if someone could provide a complete or more elegant answer for this part.
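For anyone who wants to double-check these two examples, here is a minimal numerical sketch using SciPy (the closed forms work out to $\log 3 + 2 \approx 3.1$ and $\log 3 + \tfrac{3}{2} \approx 2.6$ nats, respectively):

```python
# Quick numerical check of the two examples above (differential entropy in nats).
import numpy as np
from scipy.integrate import quad

# Type 2 example: f(x) = 3/x^2 on (3, inf); the mean is infinite.
f2 = lambda x: 3.0 / x**2
h2, _ = quad(lambda x: -f2(x) * np.log(f2(x)), 3, np.inf)

# Type 3 example: f(x) = 9/|x|^3 on |x| > 3; the variance is infinite.
# The density is symmetric, so integrate one tail and double it.
f3 = lambda x: 9.0 / x**3
h3, _ = quad(lambda x: -f3(x) * np.log(f3(x)), 3, np.inf)
h3 *= 2

print(f"h for 3/x^2  : {h2:.3f} nats (log(3) + 2   = {np.log(3) + 2:.3f})")
print(f"h for 9/|x|^3: {h3:.3f} nats (log(3) + 1.5 = {np.log(3) + 1.5:.3f})")
```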

-
Great! On SE, comments are not considered permanent; anything relevant should be included in the answer itself. The same goes for links to materials (so either prove/show something or link to it, rather than just saying it). BTW: as for moments, as far as I can see any finite moment $\langle x^\alpha \rangle$ (for any $\alpha>0$) leads to bounded entropy (I just realized it after looking at the proof for variance). – Piotr Migdal Jun 10 '15 at 07:45
-
Thank you, Piotr, for the advice about SE policies. (Yeah, I'm obviously new here.) About finite moments leading to bounded entropy, would you share your proof? Thanks! – syeh_106 Jun 10 '15 at 09:02
-
@PiotrMigdal I plan to leave the answer to this question in its current state after adding a final touch. Motivated by Piotr's comment above, I considered whether a finite mean leads to finite entropy. I couldn't conclude this in general. What I did find is that it is true if the support of the RV is half-bounded. Please see the revised answer above. I look forward to a better answer from someone someday. – syeh_106 Jun 13 '15 at 05:00
-
"It's not hard to verify that its differential entropy is infinite." Can you show how to verify this? It seems true for Riemann integral, but differential entropy is with respect to Lebesgue measure. I'm having trouble verifying that the corresponding Lebesgue integral does not converge. – cantorhead Apr 04 '18 at 16:52
-
@cantorhead For $x \ge e$, $\log(1/f(x)) \ge \log(x)$, so $f(x)\log(1/f(x)) \ge \log(2)/(x\log(x))$; since $\int \frac{dx}{x\log(x)} = \log\log(x)$ diverges as $x\to\infty$, so does $\int f(x)\log(1/f(x))\,dx$. – syeh_106 Apr 05 '18 at 00:28
-
Thank you. About the relationship between moments and entropy, another nice example of a Type 3 RV is $X$ with the standard Cauchy (aka Lorentz) distribution. In this case the mean, $\mathrm{E}[X]$, does not exist but $H(X)=\log(4\pi)$. – cantorhead Apr 05 '18 at 02:51
-
" It grows quite slowly " That I don't understand : it "grows" with respect of what? – leonbloy Apr 27 '19 at 13:27