The problem is stated as:

Let $f$ denote the density function of the random variable $X$. $X$ has a symmetric distribution around $a$; in other words, $f(a+h) = f(a-h)$ for all $h$. Prove that $E(X) = a$, provided it exists.

I understand from the general concept that because $X$ is symmetric, the mean, $E(X)$, should be at $a$, but I am not sure what axioms to use to prove this.

1 Answer

With some algebra (substituting $y = x - a$ in each piece), we have:

$$E[X]=\int_{-\infty}^{\infty} xf(x)\,dx=\int_{-\infty}^{a}xf(x)\,dx+\int_{a}^{\infty}xf(x)\,dx=\int_{-\infty}^{0}(y+a)f(y+a)\,dy+\int_{0}^{\infty}(y+a)f(y+a)\,dy$$ The first integral can be rewritten, via the substitution $y \to -y$, as $\int_0^{\infty}(a-y)f(a-y)\,dy=\int_0^{\infty}(a-y)f(y+a)\,dy$ due to symmetry around $a$. When we sum this with the second integral, the $\pm y$ terms cancel (each piece is finite because $E(X)$ exists), leaving $\int_0^{\infty}2af(y+a)\,dy=2a\int_a^{\infty}f(x)\,dx=2a\cdot\frac{1}{2}=a$, where $\int_a^{\infty}f(x)\,dx=\frac{1}{2}$ because the symmetry puts half of the probability mass on each side of $a$.
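As a quick numerical sanity check (not a substitute for the proof), here is a minimal Python sketch. The Laplace density centered at $a = 2$ is an arbitrary choice for illustration; any symmetric density with a finite mean would behave the same way. The split at $a$ mirrors the split used in the derivation above.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative density (an assumption, not from the original answer):
# Laplace centered at a = 2, symmetric around a, with a finite mean.
a = 2.0

def f(x):
    return 0.5 * np.exp(-abs(x - a))

# E[X] = integral of x*f(x) over the real line, split at a as in the proof.
left, _ = quad(lambda x: x * f(x), -np.inf, a)
right, _ = quad(lambda x: x * f(x), a, np.inf)
mean = left + right

# The mass above a, used in the last step of the proof; should be 1/2.
upper_mass, _ = quad(f, a, np.inf)

print(f"E[X]     = {mean:.6f}  (expected {a})")
print(f"P(X > a) = {upper_mass:.6f}  (expected 0.5)")
```

Both printed values should match $a$ and $\frac{1}{2}$ up to quadrature error.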

gunes
  • What exactly was the purpose of initially setting the bounds of the integrals to $a$, then switching them to $0$? Was it to enable the substitution so that the integrals could be added and simplified? – Crystal McMillian Jan 18 '19 at 07:53
  • There are probably other ways to do it, but my aim was to negate the integration variable and sum the two integrals, since they share the same bounds after the negation, exploiting the symmetry. – gunes Jan 18 '19 at 07:55