I came across a pretty cool paper whose idea makes a lot of sense to me.

Ma, Jian, and Zengqi Sun. "Mutual information is copula entropy." Tsinghua Science & Technology 16.1 (2011): 51-54.

The gist is that the copula is the "rest" of the multivariate distribution once the marginal distributions are accounted for, so it captures the dependence between the marginal variables; mutual information also describes the dependence between the marginal variables, so the two must be related.
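
As I understand it, the derivation runs roughly as follows (my own sketch via Sklar's theorem, in my notation rather than the paper's). Sklar's theorem factors the joint density as $f(x,y) = c(F_X(x), F_Y(y))\, f_X(x)\, f_Y(y)$, where $c$ is the copula density and $F_X, F_Y$ are the marginal CDFs. Substituting this into the definition of mutual information and changing variables to $u = F_X(x)$, $v = F_Y(y)$ gives

$$I(X;Y) = \iint f(x,y) \log \frac{f(x,y)}{f_X(x)\, f_Y(y)} \, dx \, dy = \iint_{[0,1]^2} c(u,v) \log c(u,v) \, du \, dv = -H_c,$$

where $H_c = -\iint_{[0,1]^2} c \log c \, du \, dv$ is the (differential) entropy of the copula density, i.e., the copula entropy.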

The author has an R package, copent, that implements this copula-based method for calculating mutual information. I am confused about the result of a simulation I did.

#install.packages("copent")
library(copent)
set.seed(2021)
B <- 25 # number of simulations to do
v <- rep(NA, B) # blank vector to hold simulated copula entropy values
for (i in 1:B){
  x <- runif(100) # marginal X variable
  y <- runif(100) # marginal Y variable (independent from X)
  v[i] <- copent::copent(data.frame(x, y)) # Save the copula entropy
}
summary(v)

    Min.  1st Qu.   Median     Mean  3rd Qu.     Max. 
-0.29278 -0.23628 -0.19173 -0.19102 -0.16434  0.03681  

Positive and negative entropy values? WHAT!?

(Playing around more, these negative values of entropy appear only to happen when x and y are independent.)
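
My "playing around" looked something like the following (my own check, not from the paper or the package docs): rerun the same estimate with y strongly dependent on x.

library(copent)
set.seed(2021)
x <- runif(100)                   # marginal X variable
y <- x + rnorm(100, sd = 0.1)     # Y strongly dependent on X
copent::copent(data.frame(x, y))  # positive in my runs; negatives only show up in the independent case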

What's going on in their method that allows for negative entropy?

(The claim in the paper that mutual information is negative copula entropy has me mystified, too, though perhaps that's a topic for a separate question. If both entropy and mutual information are supposed to be non-negative quantities, then that claim forces both to equal zero, which can only be true for independent variables.)
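
To spell out the tension:

$$I(X;Y) = -H_c, \qquad I(X;Y) \ge 0, \quad H_c \ge 0 \ \Longrightarrow\ I(X;Y) = H_c = 0,$$

which would hold only for independent variables, yet the paper clearly intends the result to apply in general.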

Dave
    Do not forget that [differential entropy](https://en.wikipedia.org/wiki/Differential_entropy) can be negative, as a typical pdf $f$ can take values above $1$. – ArnoV Feb 24 '21 at 14:01
  • @ArnoV If mutual information equals minus copula entropy, as the paper discusses, and the entropy can be positive, negative, or zero, then doesn't that force mutual information to be zero? I have posted my qualms in a separate question: https://stats.stackexchange.com/questions/511088/mutual-information-is-copula-entropy. – Dave Feb 24 '21 at 23:51