
We know that post-processing cannot increase information: for two random variables $X$ and $Y$, $D(X \| Y) \ge D(f(X) \| f(Y))$ for any operation $f(\cdot)$ and any divergence $D$ satisfying the data processing inequality. A strong data processing inequality gives the strict version, $D(X \| Y) > D(f(X) \| f(Y))$; adding Gaussian noise is one operation $f$ for which this holds. But is there an example of a random *multiplicative* operation (especially in multivariate form) such that $D(f^k(X) \| f^k(Y)) \to 0$ as $k \to \infty$? In particular, I am wondering whether some simple linear transformation, such as iteratively multiplying by a random Gaussian matrix, satisfies this property.
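To make the setup concrete, here is a small numerical sketch of the iteration I have in mind (the dimension, means, and covariances are arbitrary choices of mine, and the KL divergence stands in for $D$). It tracks the closed-form KL between two Gaussians while both are multiplied by the same realized random Gaussian matrix at each step. For what it's worth, conditional on the realized (almost surely invertible) matrices, the KL divergence turns out to be invariant:

```python
import numpy as np

def gauss_kl(m1, S1, m2, S2):
    """Closed-form KL( N(m1, S1) || N(m2, S2) )."""
    d = len(m1)
    S2inv = np.linalg.inv(S2)
    diff = m2 - m1
    _, ld1 = np.linalg.slogdet(S1)  # log det, stable for ill-conditioned S
    _, ld2 = np.linalg.slogdet(S2)
    return 0.5 * (np.trace(S2inv @ S1) + diff @ S2inv @ diff - d + ld2 - ld1)

rng = np.random.default_rng(0)
d = 3
# Two toy Gaussian sources (an arbitrary illustrative choice)
m1, S1 = np.zeros(d), np.eye(d)
m2, S2 = np.ones(d), 2.0 * np.eye(d)

kl_before = gauss_kl(m1, S1, m2, S2)
for k in range(5):
    A = rng.normal(size=(d, d))  # i.i.d. Gaussian matrix, a.s. invertible
    m1, S1 = A @ m1, A @ S1 @ A.T  # X -> A X maps N(m, S) to N(Am, A S A^T)
    m2, S2 = A @ m2, A @ S2 @ A.T
kl_after = gauss_kl(m1, S1, m2, S2)
print(kl_before, kl_after)  # identical up to floating-point error
```

So, at least conditional on invertible realizations, the divergence does not shrink at all here; the question is about the unconditional (marginal) behaviour, or about multiplicative operations for which strict contraction does occur.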

develarist
HSxiao
  • Could you please format the equations in your question (using LaTeX)? – goker Sep 24 '20 at 21:59
  • I think this can depend on the distributions of X and Y: for instance, one condition for what you are looking for to exist (there may be others) is that X and Y are not Dirac masses at zero, because then any multiplication will keep them as such, and you won't be able to have the strict inequality you are looking for. – William de Vazelhes Mar 02 '21 at 09:37

0 Answers