
Let $\mathbf{a}, \mathbf{b} \in \mathbb{R}^N$ be two orthogonal vectors, and let $\mathbf{x} \in \mathbb{R}^N$ be a random vector with some probability density function.

Prove or disprove that, as a consequence of the orthogonality of $\mathbf{a},\mathbf{b}$:

$$\mathrm{MI}(\mathbf{a}\cdot\mathbf{x}, \mathbf{b}\cdot\mathbf{x}) = 0$$

where MI stands for mutual information.

kjetil b halvorsen
becko

1 Answer


Counter-example:

Let $\mathbf{x} = (x_1,x_2)$, $\mathbf{a} = (1,0)$ and $\mathbf{b} = (0,1)$

Then

$$\mathbf{a} \cdot \mathbf{x} = x_1$$ $$\mathbf{b} \cdot \mathbf{x} = x_2$$

and

$$\mathrm{MI}(\mathbf{a}\cdot\mathbf{x}, \mathbf{b}\cdot\mathbf{x}) = \mathrm{MI}(x_1,x_2)$$

If, in addition, the components $x_1$ and $x_2$ have nonzero mutual information (for instance, if they are correlated), then $\mathbf{a}\cdot\mathbf{x}$ and $\mathbf{b}\cdot\mathbf{x}$ also have nonzero mutual information, i.e. $\mathrm{MI}(\mathbf{a}\cdot\mathbf{x}, \mathbf{b}\cdot\mathbf{x}) \neq 0$, which disproves the claim.
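The counter-example is easy to check numerically. The sketch below (my own illustration, not from the answer) draws $\mathbf{x}$ from a bivariate Gaussian with correlation $\rho = 0.8$, so the two projections are jointly Gaussian and their mutual information has the closed form $-\tfrac{1}{2}\log(1-\rho^2)$, which is strictly positive:

```python
import numpy as np

# Orthogonal projection vectors, as in the counter-example.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
assert a @ b == 0.0  # a and b are orthogonal

# x is a bivariate Gaussian with correlated components.
rho = 0.8
cov = np.array([[1.0, rho],
                [rho, 1.0]])
rng = np.random.default_rng(0)
x = rng.multivariate_normal(np.zeros(2), cov, size=100_000)

u = x @ a  # a . x = x_1
v = x @ b  # b . x = x_2

# For jointly Gaussian variables with correlation r,
# MI(u, v) = -0.5 * log(1 - r^2) in nats.
r = np.corrcoef(u, v)[0, 1]
mi = -0.5 * np.log(1.0 - r**2)
print(f"estimated MI(a.x, b.x) = {mi:.3f} nats")  # positive, not 0
```

Despite the orthogonality of $\mathbf{a}$ and $\mathbf{b}$, the estimated mutual information is close to $-\tfrac{1}{2}\log(1-0.64) \approx 0.51$ nats, not zero.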

Sextus Empiricus