6

Let $x$ have the probability density:

$$f(x) = \frac{x^{\alpha - 1}(1-x)^{\beta - 1}}{\mathrm{B}(\alpha,\beta)}$$

where $\alpha,\beta$ are two positive parameters and $0 \le x \le 1$ is the domain of $x$. What is the expected value of $1/x$? That is, what is the value of:

$$\langle1/x\rangle = \int_0^1 \frac{f(x)}{x} \mathrm d x$$

kjetil b halvorsen
becko
    1/X follows a beta prime distribution, see https://en.wikipedia.org/wiki/Beta_prime_distribution – Tim Jun 09 '17 at 07:49

2 Answers

12

First note that the pdf of a Beta$(\alpha, \beta)$ distribution is only defined for $\alpha, \beta > 0$: when $\alpha \leq 0$ or $\beta \leq 0$, the integral $$\int_0^1 x^{\alpha - 1} (1 - x)^{\beta - 1}\, dx$$ diverges, so the density cannot be normalized. Now, \begin{align*} E\left(\dfrac{1}{X} \right) & = \int_0^1 \dfrac{1}{x} \dfrac{x^{\alpha - 1} (1 - x)^{\beta - 1}}{B(\alpha, \beta)}\,dx\\ & = \int_0^1 \underbrace{\dfrac{x^{(\alpha - 1) - 1} (1 - x)^{\beta - 1}}{B(\alpha-1, \beta)}}_{\text{Beta}(\alpha-1,\, \beta) \text{ pdf}} \dfrac{B(\alpha-1, \beta)}{B(\alpha, \beta)}\, dx\\ & = \dfrac{B(\alpha-1, \beta)}{B(\alpha, \beta)} \qquad \text{if } \alpha - 1 > 0\,, \end{align*} since the underbraced factor is a Beta$(\alpha-1, \beta)$ density and integrates to $1$.

Thus when $\alpha > 1$, the expectation $E(1/X)$ is finite and equals the ratio above (which can be simplified further, as per Nate Pope's comment). When $0 < \alpha \leq 1$, the integral diverges and $E(1/X)$ is infinite.
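As a quick numerical sanity check (my sketch, not part of the derivation, using only the Python standard library; the helper `beta_fn` is just the Beta function written via `math.gamma`): for $\alpha = 3$, $\beta = 2$ the ratio $B(\alpha-1,\beta)/B(\alpha,\beta)$ and a direct midpoint-rule quadrature of the integral should both give $2$.

```python
import math

def beta_fn(a, b):
    """Beta function via the Gamma function (stdlib only)."""
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

alpha, beta = 3.0, 2.0          # need alpha > 1 for E[1/X] to be finite
closed_form = beta_fn(alpha - 1, beta) / beta_fn(alpha, beta)

# Midpoint-rule approximation of E[1/X] = int_0^1 x^(a-2) (1-x)^(b-1) / B(a,b) dx
n = 200_000
h = 1.0 / n
integral = h * sum(
    ((i + 0.5) * h) ** (alpha - 2) * (1 - (i + 0.5) * h) ** (beta - 1)
    for i in range(n)
) / beta_fn(alpha, beta)

print(closed_form)   # 2.0, i.e. B(2,2)/B(3,2) = (1/6)/(1/12)
print(integral)      # approximately 2.0
```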

Greenparker
    This can be further simplified by noting that $B(a+1,b) = \frac{a}{a+b}B(a,b)$. Thus the expectation becomes $\frac{\alpha + \beta - 1}{\alpha - 1}$. – Nate Pope Jun 08 '17 at 21:39
    +1 For a very similar trick, see [this answer](https://stats.stackexchange.com/a/139511/6633) for the calculation of $E[1/X]$ where $X$ is a Gamma random variable. – Dilip Sarwate Jun 09 '17 at 00:30
8

I want to point out another interesting solution method, which also generalizes the result to the expectation of $X^{-m}$ for integers $m=1,2,3,\dots$. I will use moment generating functions (mgf) and results from the paper by N. Cressie et al., "The Moment-Generating Function and Negative Integer Moments", http://amstat.tandfonline.com/doi/abs/10.1080/00031305.1981.10479334?journalCode=utas20.

They give the result that when $X$ is a positive random variable with mgf $M_X(t)$ which is defined in an open neighbourhood of the origin, then we have $$ \DeclareMathOperator{\E}{\mathbb{E}} \E X^{-m} = \Gamma(m)^{-1} \int_0^\infty t^{m-1} M_X(-t) \; dt $$ for positive integers $m$.
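This identity can be sanity-checked numerically on a distribution whose mgf and negative moments are both known in closed form. A stdlib-Python sketch (the function name `neg_moment_via_mgf` is mine) using a Gamma variable with unit rate, for which $M_X(-t) = (1+t)^{-k}$ and $\E X^{-m} = \Gamma(k-m)/\Gamma(k)$; the substitution $t = u/(1-u)$ maps the half-line onto $(0,1)$:

```python
import math

def neg_moment_via_mgf(mgf_neg, m, n=400_000):
    """Approximate E[X^(-m)] = Gamma(m)^(-1) * int_0^inf t^(m-1) M_X(-t) dt
    (the Cressie et al. identity) by the midpoint rule after t = u/(1-u)."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        u = (i + 0.5) * h
        t = u / (1 - u)
        total += t ** (m - 1) * mgf_neg(t) / (1 - u) ** 2  # dt = du/(1-u)^2
    return total * h / math.gamma(m)

# X ~ Gamma(shape k, rate 1): M_X(-t) = (1 + t)^(-k), E[X^(-m)] = Gamma(k-m)/Gamma(k)
k, m = 4, 2
approx = neg_moment_via_mgf(lambda t: (1.0 + t) ** (-k), m)
exact = math.gamma(k - m) / math.gamma(k)   # = 1/6
print(approx, exact)
```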

It is known that the mgf of the beta distribution is given by a confluent hypergeometric function: $$ M_X(t) = {}_1F_1(\alpha;\alpha+\beta;t), $$ so the result above gives $$ \E X^{-m} = \Gamma(m)^{-1} \int_0^\infty t^{m-1} {}_1F_1(\alpha;\alpha+\beta;-t)\; dt. $$ I evaluated this integral with the help of Maple:

assume(a > 0, b > 0); assume(m - 1, posint);
GAMMA(m)^(-1) * int( t^(m-1)*hypergeom([a], [a+b], -t), t = 0..infinity );
                   GAMMA(a + b) GAMMA(a - m)
                   -------------------------
                   GAMMA(a) GAMMA(a + b - m)

so finally we can write the result as $$ \E X^{-m} = \frac{\Gamma(\alpha+\beta)\Gamma(\alpha-m)}{\Gamma(\alpha)\Gamma(\alpha+\beta-m)}, $$ which coincides with the other answer for $m=1$. Some human mathematics is then needed to conclude that the assumption $\alpha > m$ is required for this to be valid.
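A final check of this closed form against a Monte Carlo estimate (my sketch, stdlib Python only; the helper name `beta_neg_moment` is mine). For $\alpha = 5$, $\beta = 2$, $m = 2$ the formula gives $\Gamma(7)\Gamma(3)/(\Gamma(5)\Gamma(5)) = 1440/576 = 2.5$:

```python
import math
import random

def beta_neg_moment(alpha, beta, m):
    """Closed-form E[X^(-m)] for X ~ Beta(alpha, beta); valid when alpha > m."""
    return (math.gamma(alpha + beta) * math.gamma(alpha - m)
            / (math.gamma(alpha) * math.gamma(alpha + beta - m)))

random.seed(0)
alpha, beta, m = 5.0, 2.0, 2

# Sample mean of X^(-m) over draws from Beta(alpha, beta)
n = 200_000
mc_estimate = sum(random.betavariate(alpha, beta) ** (-m) for _ in range(n)) / n

exact = beta_neg_moment(alpha, beta, m)   # 2.5
print(exact, mc_estimate)
```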

kjetil b halvorsen