
Consider two random variables $X,Y$ (possibly multi-dimensional). The mutual information is defined by:

$$ I(X,Y) = \sum_{x,y} P(x,y)\ln\left(\frac{P(x,y)}{P(x)P(y)}\right) $$

where $P(x,y)$ is the joint distribution of $X,Y$ and $P(x)$, $P(y)$ are the marginals (for continuous variables the sum becomes an integral over the densities).
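
For concreteness, here is a minimal numerical sketch of this definition for a discrete joint distribution; the joint table `P_xy` below is made up purely for illustration:

```python
import numpy as np

# Hypothetical joint distribution P(x, y): rows index values of X,
# columns index values of Y; the entries sum to 1.
P_xy = np.array([
    [0.20, 0.05],
    [0.10, 0.30],
    [0.05, 0.30],
])

# Marginals P(x) and P(y), obtained by summing out the other variable.
P_x = P_xy.sum(axis=1, keepdims=True)
P_y = P_xy.sum(axis=0, keepdims=True)

# Mutual information in nats: sum over all (x, y) with P(x, y) > 0.
mask = P_xy > 0
I_xy = np.sum(P_xy[mask] * np.log(P_xy[mask] / (P_x @ P_y)[mask]))

print(I_xy)  # 0 iff X and Y are independent; positive otherwise
```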

It is often said that the mutual information $I(X,Y)$ quantifies "correlations of all orders" between $X$ and $Y$. Is there a way to make this statement precise, for instance by expanding $I(X,Y)$ in terms of the (joint) moments of $X$ and $Y$?
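
For illustration of the kind of expansion I have in mind, in the special case where $X,Y$ are jointly Gaussian with correlation coefficient $\rho$, the mutual information has a closed form whose series expansion involves only powers of $\rho$:

$$ I(X,Y) = -\tfrac{1}{2}\ln\left(1-\rho^2\right) = \tfrac{1}{2}\sum_{k=1}^{\infty}\frac{\rho^{2k}}{k}. $$

For general (non-Gaussian) distributions, the question is whether an analogous expansion in joint moments of all orders exists.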

becko
  • [Mutual information is related to the copula: Ma, Jian, and Zengqi Sun. "Mutual information is copula entropy." Tsinghua Science & Technology 16.1 (2011): 51-54.](https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=6077935) I posted two questions about this in the past year: [1](https://stats.stackexchange.com/q/510992/247274) [2](https://stats.stackexchange.com/questions/511088/mutual-information-relationship-to-copula-entropy-is-borked). – Dave Sep 07 '21 at 16:34

0 Answers