Is there anything inherently wrong with trying to estimate the entropy of a multidimensional random variable by first transforming it (by some method) into a single-dimensional variable?

Sycorax

1 Answer

If $X$ is a random vector and $Y=g(X)$, the Doob-Dynkin Lemma says that the sigma-field generated by $Y$ is contained in the sigma-field generated by $X$. This implies that $H(Y)\leq H(X)$, with equality only when $g$ loses no information about $X$ (for discrete $X$, when $g$ is injective on the support of $X$). Hence, your idea may in general produce an estimate of the entropy of $X$ that is too small.
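A quick numerical illustration of the inequality (a sketch of my own, not part of the original answer): take $X=(X_1,X_2)$ uniform on $\{0,1\}^2$, so $H(X)=2$ bits, and collapse it to one dimension with the non-injective map $g(x_1,x_2)=x_1+x_2$.

```python
from collections import Counter
from itertools import product
from math import log2

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as a dict {value: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# X = (X1, X2) uniform on {0,1}^2, so H(X) = 2 bits.
px = {xy: 0.25 for xy in product([0, 1], repeat=2)}

# Collapse to one dimension with g(x1, x2) = x1 + x2 (not injective).
py = Counter()
for (x1, x2), p in px.items():
    py[x1 + x2] += p

print(entropy(px))  # 2.0 bits
print(entropy(py))  # 1.5 bits -- strictly smaller, so the 1-D estimate is biased low
```

An injective transform (e.g. $g(x_1,x_2)=2x_1+x_2$ here) would preserve the entropy, but a generic dimension-reducing map need not.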

Zen
  • Thanks Zen. I'm not knowledgeable enough to understand the details of the explanation, and I couldn't access the proof that the entropy of the subset sigma-field is always less than that of the superset one, but consensus indicates that your answer is good, and the final result answers my question, so I've accepted it. Again, thanks – user3557985 Jun 02 '14 at 15:36
  • @user3557985: The proof for discrete random variables is very accessible and is outlined on page 43, exercise 5, of this excellent book chapter https://web.cse.msu.edu/~cse842/Papers/CoverThomas-Ch2.pdf – Zen Jun 02 '14 at 17:02