
I am just looking for a quick check of whether my reasoning is correct when calculating $E(X)$, given that $X \sim \Gamma(\alpha, \beta)$ with shape $\alpha$ and scale $\beta$. My calculation is as follows:


\begin{align*}
\text{E}(X) &= \int_0^{\infty} x \frac{1}{\Gamma(\alpha) \beta^{\alpha}} x^{\alpha - 1} e^{-\frac{x}{\beta}} dx\\
&= \alpha \beta \int_0^{\infty} \frac{1}{\alpha\Gamma(\alpha) \beta \beta^{\alpha}} x^{\alpha} e^{-\frac{x}{\beta}} dx\\
&= \alpha \beta \int_0^{\infty} \frac{1}{\Gamma(\alpha + 1) \beta^{\alpha + 1}} x^{(\alpha + 1) - 1} e^{-\frac{x}{\beta}} dx \text{ (since } \Gamma(\alpha + 1) = \alpha\Gamma(\alpha) \text{ and } \beta^{\alpha + 1} = \beta \beta^{\alpha})\\
&= \alpha \beta,
\end{align*}

since the integrand is the density of a $\Gamma(\alpha + 1, \beta)$-distributed random variable.
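
For a quick numerical sanity check, one can compare $\alpha \beta$ against a direct integration of $x f(x)$, e.g. with SciPy (the values $\alpha = 2.5$ and $\beta = 1.5$ below are arbitrary examples):

```python
import numpy as np
from scipy import integrate
from scipy.stats import gamma

alpha, beta = 2.5, 1.5  # arbitrary example shape and scale values

# scipy.stats.gamma takes a shape parameter `a` and a `scale` parameter,
# matching the (shape, scale) parametrization used above.
f = gamma(a=alpha, scale=beta).pdf

# E(X) computed by integrating x * f(x) over (0, infinity)
mean_numeric, _ = integrate.quad(lambda x: x * f(x), 0, np.inf)

print(mean_numeric)  # approximately 3.75
print(alpha * beta)  # 3.75
```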


I know my answer is correct. However, the answer key in the textbook I have at hand (Introduction to Probability, by Roussas) arrives at it differently, in a way very similar to Michael Hardy's answer here: https://math.stackexchange.com/questions/1967601/expected-value-of-the-gamma-distribution.

In essence it appears that my approach is to turn the integrand into a pdf, whereas the alternate approach is to turn the integral into a value of the gamma function. My method seems very slightly simpler, but the fact that I haven't found anyone else using my approach leads me to ask: is there a flaw in my reasoning?
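
For reference, my understanding of that alternate approach is, roughly, the following sketch: substitute $t = x/\beta$ so that the integral becomes a value of the gamma function:

\begin{align*}
\text{E}(X) &= \frac{1}{\Gamma(\alpha) \beta^{\alpha}} \int_0^{\infty} x^{\alpha} e^{-\frac{x}{\beta}} dx\\
&= \frac{1}{\Gamma(\alpha) \beta^{\alpha}} \int_0^{\infty} (\beta t)^{\alpha} e^{-t} \beta \, dt \text{ (substituting } t = x/\beta)\\
&= \frac{\beta}{\Gamma(\alpha)} \int_0^{\infty} t^{\alpha} e^{-t} dt\\
&= \frac{\beta \, \Gamma(\alpha + 1)}{\Gamma(\alpha)}\\
&= \alpha \beta.
\end{align*}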

Thanks.

Novice
  • My answer in the duplicate thread uses your method and applies it to arbitrary powers of $X,$ generalizing from $X=X^1.$ You don't have to deal with the $\beta$ because it's a scale parameter and therefore will appear in $E[X^p]$ as a factor $\beta^p.$ – whuber Feb 19 '20 at 23:44
  • I saw that thread, but I didn't realize it would be applicable to my question as well. I will study it more closely. – Novice Feb 19 '20 at 23:49

0 Answers