
Consider the linear model

$y_i = x_i' \beta+u_i$ for $i=1,\ldots,n$

with $E(y_i \mid x_i)=x_i' \beta \iff E(u_i \mid x_i)=0$. Assume that the observations $(y_i, x_i')$ are independent over $i=1,\ldots,n$.

The textbook claims that $E(u_i \mid x_1,\ldots,x_n)=E(u_i \mid x_i)$. Why is this? How does knowing that $(y_i, x_i')$ is independent of $(y_j, x_j')$ tell us that $E(u_i \mid x_j)=0$?

jros

1 Answer


Independence survives functions: if random variables $X$ and $Y$ are independent, then so are $f(X)$ and $g(Y)$ for any (measurable) functions $f$ and $g$. See this thread for a precise discussion.

And $u_i = y_i - x_i'\beta$ is a function of $(y_i, x_i')$. Since $(y_i, x_i')$ is independent of $(y_j, x_j')$ for $j \neq i$, and $x_j$ is itself a function of $(y_j, x_j')$, it follows that $u_i$ is independent of $x_j$. Hence $E(u_i \mid x_j) = E(u_i) = E[E(u_i \mid x_i)] = 0$.
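As a quick numerical check (a sketch of my own, not part of the original answer; the data-generating process below, with standard-normal $x_i$ and $u_i$, is an assumption chosen purely for illustration), one can simulate i.i.d. pairs and verify that $u_i$, recovered as a function of $(y_i, x_i)$, is empirically uncorrelated with $x_j$ from a different observation:

```python
import numpy as np

# Illustrative simulation: draw i.i.d. pairs (x_i, u_i) with E(u_i | x_i) = 0,
# form y_i = x_i * beta + u_i, recover u_i = y_i - x_i * beta, and check that
# u_1 is (empirically) uncorrelated with x_2 from an independent observation.
rng = np.random.default_rng(0)
reps = 200_000
beta = 2.0

x = rng.normal(size=(reps, 2))   # columns: x_1 and x_2, independent draws
u = rng.normal(size=(reps, 2))   # u_i independent of x_i by construction
y = x * beta + u                 # the linear model, observation by observation
u_hat = y - x * beta             # u_i as a function of (y_i, x_i)

# Sample covariance between u_1 and x_2 across replications:
cov = np.cov(u_hat[:, 0], x[:, 1])[0, 1]
print(f"cov(u_1, x_2) = {cov:.4f}")  # close to 0, as independence implies
```

With 200,000 replications the sampling noise in the covariance estimate is on the order of $1/\sqrt{200{,}000} \approx 0.002$, so the estimate sits very near zero, consistent with $E(u_i \mid x_j) = 0$.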

Christoph Hanck
  • So I understand that since $u_i$ is a function of $y_i$, $(u_i, x_i)$ must also be independent over $i=1,\ldots,n$, but it's not obvious to me why $(u_i, x_i)$ is then independent of $x_j$ – Jacopo Olivieri Nov 12 '21 at 13:52