
Considering the vector $\textbf{z} \sim \mathcal{CN}(\textbf{0}_{M},\Theta_{M \times M})$, what would be the expectation of $\frac{1}{\textbf{z}^{H} \textbf{z}}$, i.e.,

$\mathbb{E} \left\lbrace \frac{1}{\textbf{z}^{H} \textbf{z}} \right\rbrace = ?$

where $\Theta_{M \times M}$ is a Hermitian matrix that is neither diagonal nor the identity. More specifically, all the off-diagonal elements are nonzero and the diagonal elements are all distinct.
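For concreteness, the expectation can always be estimated numerically. The sketch below (not from the original thread; $\Theta$ is an arbitrary illustrative Hermitian positive-definite matrix with unequal diagonal entries and nonzero off-diagonal entries) draws samples of $\textbf{z}$ through a Cholesky factor and averages $1/(\textbf{z}^{H}\textbf{z})$:

```python
import numpy as np

rng = np.random.default_rng(0)

M = 4
# Illustrative Hermitian positive-definite covariance (hypothetical choice):
# A @ A^H is Hermitian PSD; adding M*I makes it strictly positive definite.
A = rng.standard_normal((M, M)) + 1j * rng.standard_normal((M, M))
Theta = A @ A.conj().T + M * np.eye(M)

# Draw z ~ CN(0, Theta) as z = L w, where L is the Cholesky factor of Theta
# and w has i.i.d. standard circularly-symmetric complex Gaussian entries
# (real and imaginary parts each with variance 1/2).
L = np.linalg.cholesky(Theta)
n = 200_000
w = (rng.standard_normal((n, M)) + 1j * rng.standard_normal((n, M))) / np.sqrt(2)
z = w @ L.T  # each row is one sample of z, with E[z z^H] = L L^H = Theta

q = np.einsum('ij,ij->i', z.conj(), z).real  # z^H z for each sample
estimate = np.mean(1.0 / q)
print(estimate)
```

By Jensen's inequality the result must exceed $1/\mathbb{E}\{\textbf{z}^{H}\textbf{z}\} = 1/\operatorname{tr}(\Theta)$, which gives a quick sanity check on the simulation.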

Gordon Smyth
  • Hermitian matrices are diagonalizable with real eigenvalues. That simplifies the question somewhat. A special case that is easy to handle occurs when all eigenvalues are equal. Otherwise, explicit formulas are hard to come by and a slightly more delicate analysis is needed. Are you considering that special case or not? – whuber Feb 03 '18 at 13:25
  • This problem arises from telecommunications, more specifically Multiple-Input Multiple-Output (MIMO) channels, where the squares of the eigenvalues say how strong the MIMO parallel channels are. By assuming that the eigenvalues are equal we are saying that the channels have the same gain/power, which is not true in real environments. If you want you can consider the special case, let's see. Thanks! – Felipe Augusto de Figueiredo Feb 03 '18 at 16:41
  • In the special case the distribution is proportional to an [Inverse Gamma](https://en.wikipedia.org/wiki/Inverse-gamma_distribution). In the general case it will be distributed as the reciprocal of a sum of Exponential (*i.e.*, Gamma(1)) distributions of different shape parameters. To get a sense of what is involved in working that out, see https://stats.stackexchange.com/questions/72479. – whuber Feb 03 '18 at 17:11
  • I see. How could the expectation be found in the general case? Thanks – Felipe Augusto de Figueiredo Feb 03 '18 at 18:26
  • The "inverse Gaussian distribution" is not the inverse of a Gaussian random variable, so I've removed that tag. – Gordon Smyth Feb 04 '18 at 16:24
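The special case mentioned in the comments (all eigenvalues equal, i.e. $\Theta = \sigma^2 I_M$) does admit a closed form: $\textbf{z}^{H}\textbf{z}$ is then Gamma$(M, \sigma^2)$, so its reciprocal is Inverse-Gamma with mean $1/(\sigma^2(M-1))$ for $M > 1$. The sketch below (illustrative parameters, not from the thread) checks this against simulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Special case Theta = sigma2 * I: z^H z = sum of M i.i.d. Exponential(sigma2)
# variables, i.e. Gamma(M, scale=sigma2), so 1/(z^H z) is Inverse-Gamma and
# E[1/(z^H z)] = 1/(sigma2 * (M - 1)) for M > 1 (standard inverse-gamma mean).
M, sigma2, n = 5, 2.0, 500_000
w = (rng.standard_normal((n, M)) + 1j * rng.standard_normal((n, M))) * np.sqrt(sigma2 / 2)
q = np.sum(np.abs(w) ** 2, axis=1)  # z^H z for each sample

mc = np.mean(1.0 / q)
closed_form = 1.0 / (sigma2 * (M - 1))
print(mc, closed_form)
```

In the general case the same simulation applies after replacing $\sigma^2$ by the distinct eigenvalues $\lambda_i$ of $\Theta$, since $\textbf{z}^{H}\textbf{z}$ is distributed as $\sum_i \lambda_i E_i$ with $E_i$ i.i.d. standard exponentials, as the comment above notes; no comparably simple closed form is available then.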

0 Answers