
Why do we define the variance of a random variable $X$ as $\text{var}[X]=\text{E}[(X-\mu)^2]$ instead of $\text{var}[X]=\text{E}[\left|X-\mu\right|]$?

We usually interpret the standard deviation $\sigma=\sqrt{\text{var}[X]}$ as a measure of the average distance of a sample from the mean. If that is the case, isn't it more reasonable to use the absolute value instead of the square as the measure of distance? That way we wouldn't have to take a square root afterwards to obtain $\sigma$.
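For a concrete sense of how the two measures differ, here is a minimal numerical sketch (Python with NumPy is assumed; the normal sample, seed, and parameters are purely illustrative, not part of the original question):

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed, only for reproducibility

# Draw an illustrative sample from a normal distribution with sigma = 2
x = rng.normal(loc=5.0, scale=2.0, size=100_000)
mu = x.mean()

# Standard deviation: square root of the mean squared deviation
sd = np.sqrt(np.mean((x - mu) ** 2))

# Mean absolute deviation: the alternative the question proposes
mad = np.mean(np.abs(x - mu))

print(sd)   # ~2.0, matches the scale parameter
print(mad)  # ~1.6, i.e. sqrt(2/pi) * sigma for a normal distribution
```

Both numbers quantify spread, but they are genuinely different quantities; for a normal distribution the mean absolute deviation is $\sigma\sqrt{2/\pi}\approx 0.80\,\sigma$.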

Ambesh
  • a) The way it is, large deviations have a bigger impact; b) $E[(X-\mu)^2]$ has much nicer algebraic properties; try computing $E[\lvert X + Y - (\mu_X+\mu_Y)\rvert]$. – Daniel Fischer Aug 06 '13 at 08:47
  • Related: http://stats.stackexchange.com/questions/118/why-square-the-difference-instead-of-taking-the-absolute-value-in-standard-devia – Adriano Aug 06 '13 at 08:49
  • See [why is it so cool to square numbers? (in terms of finding the standard deviation)](http://mathoverflow.net/q/1048/6979), [Motivation behind standard deviation?](http://math.stackexchange.com/q/4787/856), [Why is there not a simpler way to calculate the standard deviation?](http://math.stackexchange.com/q/61107/856), [How variance is defined?](http://math.stackexchange.com/q/288068/856) –  Aug 06 '13 at 08:51
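To make the algebraic point from the first comment concrete, here is a sketch of the additivity property that the squared definition gives for independent $X$ and $Y$ (a standard expansion, not part of the original post):

$$
\begin{aligned}
\text{var}[X+Y] &= \text{E}\!\left[(X+Y-\mu_X-\mu_Y)^2\right] \\
&= \text{E}\!\left[(X-\mu_X)^2\right] + 2\,\text{E}\!\left[(X-\mu_X)(Y-\mu_Y)\right] + \text{E}\!\left[(Y-\mu_Y)^2\right] \\
&= \text{var}[X] + \text{var}[Y],
\end{aligned}
$$

where the cross term vanishes because independence gives $\text{E}[(X-\mu_X)(Y-\mu_Y)]=\text{E}[X-\mu_X]\,\text{E}[Y-\mu_Y]=0$. No comparably simple identity holds for $\text{E}[\lvert X+Y-\mu_X-\mu_Y\rvert]$.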

0 Answers