
I had almost the same question as "Why square the difference instead of taking the absolute value in standard deviation?". But I also noticed that $\sigma$, the population standard deviation, appears in the equation for the Gaussian distribution (although I could not fully 'perceive' that big equation). So if we have the population mean (a simpler concept) and the population standard deviation, and we know the dataset follows a Gaussian distribution, we can plug them directly into the equation for the Gaussian function and derive the curve.

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^{2}}$$ (equation from Wikipedia)
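For concreteness, here is a minimal Python sketch of what I mean by "deriving the curve" once $\mu$ and $\sigma$ are known (`gaussian_pdf` is just an illustrative helper name, not a library function):

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and standard deviation sigma."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# With mu and sigma known, every point of the curve follows from the formula.
mu, sigma = 0.0, 1.0
for x in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(x, gaussian_pdf(x, mu, sigma))
```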

This makes me wonder whether that is the answer to why we use the squared formula for variance and standard deviation. If we did not square (and instead used the absolute mean deviation around the mean), the equation for the Gaussian distribution probably would not work.

So my question is: do we actually use the squared formula for variance and standard deviation so that we can plug it directly into the Gaussian equation? Is my assumption correct?

PS. Instead of the standard deviation, could we use the absolute mean deviation in the Gaussian distribution?
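To illustrate the PS numerically, here is a quick sketch (assuming the data really are Gaussian) showing that for normal data the mean absolute deviation is just $\sigma\sqrt{2/\pi}$, so the two measures differ only by a constant factor:

```python
import math
import random

random.seed(0)
mu, sigma = 0.0, 1.0
# Assumption: the sample is drawn from an actual normal distribution.
data = [random.gauss(mu, sigma) for _ in range(100_000)]

mean = sum(data) / len(data)
sd = math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))  # population SD
mad = sum(abs(x - mean) for x in data) / len(data)              # mean absolute deviation

print(sd)                             # ~1.0
print(mad)                            # ~0.7979, i.e. sigma * sqrt(2/pi)
print(mad * math.sqrt(math.pi / 2))   # rescaled MAD, also ~1.0
```

So for a normal distribution one could in principle re-parameterize the density in terms of the mean absolute deviation, since it determines $\sigma$ up to the fixed factor $\sqrt{\pi/2}$; the formula is just conventionally written in terms of $\sigma$.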

  • 2
    You could read Anders Hald, _A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713 to 1935_, §5.5 (Laplace and his failed mean deviation attempt) and §7.2 (Gauss and the normal distribution), or Stephen M. Stigler, _The History of Statistics: The Measurement of Uncertainty before 1900_, Chap. 3 (Laplace) and Chap. 4 (Gauss) to see when, how and why we started using standard deviation instead of absolute mean deviation. – Sergio Jul 12 '20 at 14:50
  • 1
    For closely related questions and more answers, [search our site](https://stats.stackexchange.com/search?tab=votes&q=variance%20square*%20gauss*). – whuber Jul 12 '20 at 17:11

0 Answers