
The random variable $Y$ is defined by $Y=\log(1+aX)$ where $X$ has an $F(m,m)$ distribution and $a$ is a non-negative constant. What is the expectation of $Y$?

Kenn Tie
  • It would be much simpler to note that if $X \sim F(2,2)$, then its pdf is $f(x) = \frac{1}{(1+x)^2}$ – wolfies Dec 03 '18 at 15:54
  • $X$ has an $F(m,m)$ distribution, not $F(2,2)$. – Kenn Tie Dec 04 '18 at 03:58
  • I can't find the formula for this integral when $X$ has an $F(m,m)$ distribution. – Kenn Tie Dec 04 '18 at 04:07
  • For arbitrary degrees of freedom $a$ and $b,$ there is a relatively simple formula in terms of $\csc,$ $\Gamma$, and the logarithmic derivative of $\Gamma$ (*aka* $\psi,$ the "polygamma" function). It becomes indeterminate whenever either $2a$ or $2b$ is an integer, but can be evaluated with one or two applications of L'Hopital's Rule. – whuber Dec 04 '18 at 16:13
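
Spelling out wolfies' hint for the $m=2$ case: using the pdf $f(x) = (1+x)^{-2}$ on $x > 0$, integration by parts (the boundary term vanishes) followed by partial fractions gives

$$ E[\log(1+aX)] = \int_0^\infty \frac{\log(1+ax)}{(1+x)^2}\,dx = \int_0^\infty \frac{a}{(1+ax)(1+x)}\,dx = \frac{a\log a}{a-1}, \qquad a > 0,\ a \neq 1, $$

with the values $1$ at $a = 1$ and $0$ at $a = 0$ by continuity. The general $F(m,m)$ case is what the answer below works toward.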

1 Answer

Let $X,Y$ be two independent $\chi^2_m$ random variables. We can write $F = X/Y \sim F_{m,m}.$ (Note: what you call $X$, I am calling $F$ here.)

Notice that $$ E\left[\log(1 + aF)\right] = E\left[\log\left(\frac{aX + Y}{Y}\right)\right] = E[\log(aX + Y)] - E[\log(Y)]. $$ This might be easier to evaluate. $E[\log(Y)]$ has a formula you can look up.
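
For reference, the formula in question is the standard mean of the log of a chi-square variable (a Gamma with shape $m/2$ and scale $2$):

$$ Y \sim \chi^2_m \quad\Longrightarrow\quad E[\log Y] = \psi\!\left(\tfrac{m}{2}\right) + \log 2, $$

where $\psi$ denotes the digamma function.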

If $a=1$, you can use the same formula for the other part. If not, then general linear combinations of Gamma random variables have a lesser-known distribution (see [this](https://stats.stackexchange.com/questions/72479/generic-sum-of-gamma-random-variables) and [this](https://stats.stackexchange.com/questions/2035/the-distribution-of-the-linear-combination-of-gamma-random-variables)). Perhaps there is some result one can dig up about expectations of the logarithm of these types of random variables. Or we could try integration.
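
Spelling out the $a=1$ case with the formula above: since $aX + Y = X + Y \sim \chi^2_{2m}$,

$$ E[\log(1+F)] = E[\log(X+Y)] - E[\log Y] = \bigl(\psi(m) + \log 2\bigr) - \bigl(\psi\!\left(\tfrac{m}{2}\right) + \log 2\bigr) = \psi(m) - \psi\!\left(\tfrac{m}{2}\right). $$

As a check, at $m = 2$ this equals $\psi(2) - \psi(1) = 1$, matching the $a \to 1$ limit of the worked $F(2,2)$ integral after the question's comments.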

Taylor
  • In [this](https://stats.stackexchange.com/questions/72479/generic-sum-of-gamma-random-variables) I can't find a closed-form solution; it's an approximation. In [this](https://stats.stackexchange.com/questions/2035/the-distribution-of-the-linear-combination-of-gamma-random-variables), the pdf cannot be integrated in closed form. – Kenn Tie Dec 05 '18 at 10:46
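
Since a closed form for general $a$ seems hard to pin down from those threads, a quick Monte Carlo check of the decomposition (and of the $a = 1$ closed form above) is easy to run. A minimal sketch, assuming NumPy and SciPy, with $m$, $a$, and the sample size chosen arbitrarily:

```python
# Monte Carlo sanity check of the decomposition
#   E[log(1 + a*F)] = E[log(a*X + Y)] - E[log(Y)],  F = X/Y,  X, Y iid chi^2_m,
# and of the a = 1 closed form psi(m) - psi(m/2) noted above.
import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(0)
m, a, n = 5, 1.0, 10**6                 # degrees of freedom, constant a, sample size (arbitrary)

X = rng.chisquare(m, size=n)
Y = rng.chisquare(m, size=n)
F = X / Y                               # F has an F(m, m) distribution

lhs = np.log1p(a * F).mean()                          # E[log(1 + a*F)] by simulation
rhs = np.log(a * X + Y).mean() - np.log(Y).mean()     # the decomposition
closed = digamma(m) - digamma(m / 2)                  # exact value when a = 1

print(lhs, rhs, closed)                 # should agree up to Monte Carlo error
```

For $a \neq 1$ the first two printed numbers should still agree; only the third, $\psi(m) - \psi(m/2)$, is specific to $a = 1$.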