
I just read an article which says that the mean deviation (MD) is more efficient than the standard deviation (SD) when there are some errors in the observations, as there often are in real practice.

I don't know exactly what 'efficient' means here, but if the article is right, then using MD instead of SD in data mining might improve a model's accuracy when the training/test data contain some errors. Is that a reasonable assumption?
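For concreteness, here is a minimal Monte Carlo sketch of how I understand the claim: simulate normal samples, optionally replace a small fraction of points with gross errors, and compare how much the two scale estimates bounce around. The 2% contamination rate and the 3-sigma error distribution are just assumptions I picked for illustration, not anything from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_deviation(x):
    # Mean absolute deviation about the sample mean.
    return np.mean(np.abs(x - x.mean()))

def standardized_variance(estimates):
    # Variance divided by squared mean: a scale-free measure of an
    # estimator's sampling variability, so MD and SD can be compared
    # even though they estimate different multiples of sigma.
    e = np.asarray(estimates)
    return e.var() / e.mean() ** 2

def efficiency_of_md_vs_sd(contamination=0.0, n=100, reps=10000):
    """Ratio of standardized variances, SD over MD.
    Values above 1 mean MD is the more efficient scale estimator."""
    md, sd = [], []
    for _ in range(reps):
        x = rng.normal(0.0, 1.0, n)
        k = int(round(contamination * n))
        if k:
            # Crude model of "errors in observations": a few points
            # are drawn from a normal with three times the spread.
            idx = rng.choice(n, size=k, replace=False)
            x[idx] = rng.normal(0.0, 3.0, k)
        md.append(mean_deviation(x))
        sd.append(x.std(ddof=1))
    return standardized_variance(sd) / standardized_variance(md)

# If the classical theory holds, the first ratio should land roughly
# around 0.88 (SD wins on clean normal data) and the second above 1
# (MD wins once gross errors are present).
print("clean normal data :", efficiency_of_md_vs_sd(0.00))
print("2% gross errors   :", efficiency_of_md_vs_sd(0.02))
```

The standardized variance (variance over squared mean) is used so the comparison does not depend on MD and SD having different scalings under normality.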

user2652379
    The "errors" they're talking about are presumably *outliers*. Absolute deviation is less pulled by extreme values than SD, which makes it less efficient w/ normal data, but more efficient w/ contaminated data. – gung - Reinstate Monica Sep 24 '15 at 04:57
  • Protection against outliers is an important criterion when choosing a statistical model to fit. Fortunately, much progress has been made on this front since Eddington discussed the merits of mean deviation in 1914. This raises the question: [why not use them?](http://stats.stackexchange.com/a/48867/603) – user603 Sep 24 '15 at 10:40

0 Answers