In large sample theory, I'm told that as $n$ grows larger and larger ($n$ being the number of samples in a dataset), $\sqrt n(\hat \beta_1-\beta_1)$ gets closer and closer to a normal distribution. What exactly is $\sqrt n(\hat \beta_1-\beta_1)$?

Kyle
  • These terms appear to be taken from an answer on this site at http://stats.stackexchange.com/a/16460, where they are clearly defined. Could you please tell us what your reference is and what the stumbling blocks are for you to understand its terminology? – whuber Nov 07 '12 at 23:01

1 Answer

$\sqrt n (\hat\beta_1 - \beta_1)$ is a scaled version of the difference (error) between your estimate $\hat\beta_1$ and the true value of the parameter you are trying to estimate, $\beta_1$. The unscaled error $\hat\beta_1 - \beta_1$ converges in probability to zero as the sample size goes to infinity; when you multiply it by $\sqrt n$, the scaling exactly balances the shrinking error, and the product converges in distribution to a Normal (assuming that the distribution you are working with has a finite second moment).
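You can see this behavior in a quick simulation. The sketch below (my own illustration, not from the answer; the model, parameter values, and function name `scaled_slope_errors` are assumptions) repeatedly draws samples from a simple linear model, fits the OLS slope $\hat\beta_1$, and collects $\sqrt n(\hat\beta_1 - \beta_1)$. With standard normal $x$ and unit error variance, the asymptotic variance $\sigma^2/\operatorname{Var}(x)$ equals 1, so the collected values should look approximately $N(0, 1)$.

```python
import numpy as np

rng = np.random.default_rng(0)

def scaled_slope_errors(n, reps=2000, beta0=1.0, beta1=2.0, sigma=1.0):
    """Simulate sqrt(n) * (beta1_hat - beta1) for OLS with one regressor.

    Illustrative sketch: the model y = beta0 + beta1*x + noise and the
    chosen parameter values are assumptions for demonstration only.
    """
    errs = np.empty(reps)
    for i in range(reps):
        x = rng.normal(size=n)                       # regressor
        y = beta0 + beta1 * x + sigma * rng.normal(size=n)
        # OLS slope: sample covariance over sample variance of x
        b1_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)
        errs[i] = np.sqrt(n) * (b1_hat - beta1)
    return errs

errs = scaled_slope_errors(n=500)
# Here sigma^2 / Var(x) = 1, so errs should be roughly N(0, 1):
# mean near 0, standard deviation near 1.
```

Plotting a histogram of `errs` against the $N(0,1)$ density (e.g. with `matplotlib`) makes the convergence visually clear; repeating with larger `n` tightens the agreement.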

Jonathan Christensen