
$$\mathrm{Var}(\beta) = E [ (\beta- E\hat{\beta})^2 ] $$

My question is: Why is there an "$E$" instead of a summation?

Firebug

2 Answers


$E$ stands for expectation. Computing it requires some knowledge of the pdf of your random variable. Without the pdf, you can compute an empirical estimate with a sum over your data. This is related to the idea of ergodic processes: can you deduce the statistical properties from a sample of the process?
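
A minimal simulation sketch of this distinction (the model, variable names, and numbers below are my own illustrative assumptions, not from the question): the expectation $E$ is taken over the sampling distribution of $\hat\beta$, and a summation only appears when that expectation is approximated empirically from many samples.

```python
import numpy as np

# Illustrative simulation (assumed setup, not from the original post):
# fixed design x, true slope beta, i.i.d. Gaussian noise.
rng = np.random.default_rng(0)
n, beta, sigma = 30, 2.0, 1.0
x = np.linspace(0.0, 1.0, n)

def fit_slope(x, y):
    """OLS slope estimate for a simple linear regression with intercept."""
    xc = x - x.mean()
    return np.sum(xc * (y - y.mean())) / np.sum(xc ** 2)

# Empirical route: replace the expectation E by an average (a summation)
# over many simulated samples of the slope estimate.
reps = 20_000
slopes = np.array([fit_slope(x, beta * x + rng.normal(0.0, sigma, n))
                   for _ in range(reps)])
var_empirical = np.mean((slopes - slopes.mean()) ** 2)

# Theoretical route under this model: Var(beta_hat) = sigma^2 / sum((x_i - x_bar)^2).
var_theoretical = sigma ** 2 / np.sum((x - x.mean()) ** 2)

print(var_empirical, var_theoretical)  # close up to Monte Carlo error
```

The empirical number converges to the theoretical one as the number of simulated samples grows, which is exactly the sum-versus-expectation point above.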

You might want to be careful with your notation. Very often, $\hat{\alpha}$ denotes an estimate (e.g. based on sums) of the quantity $E(\alpha)$. You can check, for instance: Derive Variance of regression coefficient in simple linear regression.

Laurent Duval

I assume you're talking about the variance of the regression coefficient estimates (because the variance of the "true" coefficient is zero).

What you've written is not the variance of $\hat \beta$. The correct formula is

$$ E \Big( (\hat \beta - E(\hat \beta))^2 \Big) $$

whereas what you wrote is closer to the mean squared error of $\hat \beta$:

$$ E \Big( (\beta - \hat \beta)^2 \Big) $$

but not quite.
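
To make the "not quite" precise (a standard bias-variance identity, sketched here for clarity rather than taken from the original answer):

$$ E\big[(\hat\beta - \beta)^2\big] = \underbrace{E\big[(\hat\beta - E\hat\beta)^2\big]}_{\mathrm{Var}(\hat\beta)} + \underbrace{\big(E\hat\beta - \beta\big)^2}_{\text{bias}^2}, $$

while the expression in the question, $E\big[(\beta - E\hat\beta)^2\big]$, contains no random quantity inside the expectation, so it equals $(\beta - E\hat\beta)^2$, the squared bias alone.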

not_bonferroni