
So when an estimator (like OLS) is efficient, this means that the standard errors are accurate and hence $t$-tests and $F$-tests are valid. Does consistency mean the same thing?

I keep getting confused when the notes jump between consistency and efficiency. Perhaps consistency means the estimator is both unbiased (so you can draw inferences from the $B_1, B_2$ estimators) and efficient?

Ivan
  • I hope this isn't one of those self-study things again lol. This is just an understanding thing. – Ivan Nov 07 '15 at 10:13
  •
    See e.g. [this](http://stats.stackexchange.com/questions/16381/what-is-a-complete-list-of-the-usual-assumptions-for-linear-regression/16460#16460) -- answer by mpiktas, points 3 and 4; also [this](http://stats.stackexchange.com/questions/31036/what-is-the-difference-between-a-consistent-estimator-and-an-unbiased-estimator) and [this](http://stats.stackexchange.com/questions/31260/does-efficiency-imply-unbiased-and-consistency?rq=1). Also search for "consistency" and "efficiency" (separately) on this site. Have you tried reading a textbook? These things must be explained there. – Richard Hardy Nov 07 '15 at 10:21
  • Thanks, the first one at least was helpful; yeah, I had a look at the glossary. – Ivan Nov 07 '15 at 11:44

1 Answer


Consistency is sort of an asymptotic version of unbiasedness.

Unbiasedness: $E(\hat{B}) = B$

Consistency: $\operatorname{plim}_{n \to \infty} \hat{B} = B$
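
To illustrate the distinction, here is a minimal simulation sketch (not part of the original thread; the regression $y = 2 + 3x + e$ and all numbers are made up for illustration). At every sample size the OLS slope is roughly unbiased on average, while consistency shows up as the spread of individual estimates shrinking toward the true value as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_slope(n):
    # One draw of the OLS slope from a made-up model y = 2 + 3x + e.
    x = rng.normal(size=n)
    e = rng.normal(size=n)
    y = 2 + 3 * x + e
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

# Averaging over 500 replications: the mean estimate sits near 3 at every n
# (unbiasedness), while the standard deviation of the estimates shrinks
# as n grows (consistency: individual estimates concentrate around 3).
for n in (10, 100, 10_000):
    est = np.array([ols_slope(n) for _ in range(500)])
    print(n, round(est.mean(), 3), round(est.std(), 3))
```

The point of the printout is the contrast: the mean column is stable across sample sizes, but only the spread column collapses, which is exactly the plim statement.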

  • Oh, you edited, lol, I was about to say.... What is the significance of this for the classical OLS assumptions? That if you get enough observations it becomes normal? – Ivan Nov 07 '15 at 11:45
  • Normal? In what sense? You can't demand unbiasedness from estimates on real data, the bias is always there. The only thing you can demand is consistency, which means that the bias approaches zero as the sample size approaches infinity. If your estimates are not consistent — why even bother working with them? – PhD In Procrastination Nov 07 '15 at 11:49
  • Like with the Central Limit Theorem – Ivan Nov 07 '15 at 11:50
  • The Central Limit Theorem gives asymptotic normality, while the Law of Large Numbers gives consistency. – Richard Hardy Nov 07 '15 at 12:33
  • What is the benefit of Asymptotic Normality? – Ivan Nov 10 '15 at 13:00
  • @Ivan, that's a new question which you could try looking up, but I'll give a short shot at it. Normality is convenient for hypothesis testing; $t$-tests and $F$-tests involving regression coefficients have their regular distributions under normality, so you can use regular critical values. By the way, use "@" to alert the user to whom the comment is addressed. (I found your comment accidentally, I was not alerted.) – Richard Hardy Nov 10 '15 at 20:25
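
Following up on the point about asymptotic normality and testing, here is a hedged simulation sketch (again with made-up numbers, not from the thread): even when the errors are skewed and decidedly non-normal, the usual $t$-statistic for the OLS slope rejects a true null at close to the nominal 5% rate in a large sample, so the regular critical values still apply.

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up model y = 2 + 3x + e with skewed (centered exponential) errors.
# The errors are NOT normal, yet by the CLT the standardized OLS slope is
# approximately N(0, 1) for large n -- this is what justifies the usual
# t-test critical values asymptotically.
n, reps, true_slope = 500, 2000, 3.0
tstats = []
for _ in range(reps):
    x = rng.normal(size=n)
    e = rng.exponential(size=n) - 1.0   # mean-zero but skewed errors
    y = 2 + true_slope * x + e
    b = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    a = y.mean() - b * x.mean()
    resid = y - a - b * x
    # Classical standard error of the slope: s^2 / sum((x - xbar)^2).
    se = np.sqrt(resid.var(ddof=2) / (n * x.var()))
    tstats.append((b - true_slope) / se)

tstats = np.array(tstats)
# Fraction of |t| > 1.96 should be near the nominal 5% despite non-normal errors.
print(round(np.mean(np.abs(tstats) > 1.96), 3))
```

The rejection rate landing near 0.05 is the practical payoff of asymptotic normality: inference with the standard critical values works without assuming normal errors, provided the sample is large.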