
I am trying to figure out why the following holds:

Given $y_{i}=E[y_{i}|X_{i}]+\epsilon_{i}$, it holds that

$E[\epsilon^{2}_{i}] =E[E[\epsilon^{2}_{i}|X_{i}]] = E[V[y_{i}|X_{i}]]$

Specifically, I am trying to understand why $E[\epsilon^{2}_{i}|X_{i}] = V[y_{i}|X_{i}]$.

Clearly, I need a refresher on conditional variance and the rules for expected values....

B_Miner
    I believe you will find the answer here, http://stats.stackexchange.com/questions/109735/homoskedasticity-assumption-varyx-varux-constant/109748#109748, keeping in mind that $E[\epsilon\mid X]=0$ – Alecos Papadopoulos Aug 21 '14 at 15:34

1 Answer


The first equality $E[\epsilon^{2}_{i}] =E[E[\epsilon^{2}_{i}|X_{i}]]$ is just the law of total expectation.
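As a quick numerical sanity check (my own sketch, not part of the original answer), we can simulate a toy model in which $E[y_i \mid X_i]$ is known by construction and verify that $E[\epsilon_i^2]$, $E[E[\epsilon_i^2 \mid X_i]]$, and $E[V[y_i \mid X_i]]$ all agree; the model, group structure, and variable names here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model where E[y | X] is known by construction:
# X takes values 0, 1, 2 and y = 2*X + eps with sd(eps | X) = 1 + X,
# so Var(y | X = k) = (1 + k)^2 and the model is heteroskedastic.
n = 200_000
X = rng.integers(0, 3, size=n)
eps = rng.normal(0.0, 1.0 + X)  # conditional sd depends on X
y = 2.0 * X + eps

# epsilon_i = y_i - E[y_i | X_i]; here E[y | X] = 2X exactly.
resid = y - 2.0 * X

# Unconditional expectation E[eps^2]
lhs = np.mean(resid**2)

# Iterated expectation E[E[eps^2 | X]] = sum_k P(X = k) * E[eps^2 | X = k]
rhs = sum(np.mean(X == k) * np.mean(resid[X == k] ** 2) for k in range(3))

# E[Var(y | X)]: probability-weighted average of the within-group variances
evar = sum(np.mean(X == k) * np.var(y[X == k]) for k in range(3))

print(lhs, rhs, evar)  # all three agree up to sampling noise
```

The first two quantities agree exactly (the overall mean of squared residuals is the weighted average of the group means), while the third matches up to sampling noise.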

Recall that the variance of $X$ is the expected squared deviance of $X$ from its expected value: \begin{equation} \textrm{Var}(X) = \mathbb{E}\left( \left(X - \mathbb{E}(X)\right)^2 \right). \end{equation} The conditional variance is defined similarly, but now both expectations are conditional: \begin{equation} \textrm{Var}(X \mid Y) = \mathbb{E}\left[\left( X - \mathbb{E}(X \mid Y)\right)^2 \mid Y\right]. \end{equation}

The second equality $E[\epsilon^{2}_{i}|X_{i}] = V[y_{i}|X_{i}]$ is obtained by

  1. applying the aforementioned definition of conditional variance to $\textrm{Var}[y_i \mid X_i]$, and
  2. using the given decomposition $y_i = E[y_i \mid X_i] + \epsilon_i$ to express the result in terms of $\epsilon_i$.
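Written out, those two steps are:
\begin{align}
\textrm{Var}[y_i \mid X_i]
&= \mathbb{E}\!\left[\left(y_i - \mathbb{E}[y_i \mid X_i]\right)^2 \,\middle|\, X_i\right]
&& \text{(definition of conditional variance)} \\
&= \mathbb{E}\!\left[\epsilon_i^2 \,\middle|\, X_i\right]
&& \text{(since } \epsilon_i = y_i - \mathbb{E}[y_i \mid X_i]\text{)}.
\end{align}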
Juho Kokkala