Given a set of data, the intuitive way to measure its deviation seems to be the mean absolute deviation:
$$ \frac{1}{n}\sum_{i} |x_i-\bar{x}| $$
But I understand that for theoretical reasons it is easier to work with the standard deviation:
$$ \sqrt{ \frac{1}{n-1} \sum_{i} (x_i-\bar{x})^2 } $$
It can be said that
$$ \sum_{i} |x_i-\bar{x}| \geq \sqrt{\sum_{i} (x_i-\bar{x})^2}, $$
just as the sum of the legs of a right triangle is longer than the hypotenuse.
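(For reference, this inequality can be checked by squaring. Writing $a_i = x_i - \bar{x}$ as a shorthand I am introducing here,
$$ \Big(\sum_i |a_i|\Big)^2 = \sum_i a_i^2 + \sum_{i \neq j} |a_i|\,|a_j| \;\geq\; \sum_i a_i^2, $$
and taking square roots of both sides gives the inequality above.)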
It can also be said that, for $n \geq 2$,
$$ \frac{1}{n} \lt \sqrt{\frac{1}{n-1}}. $$
Do these two inequalities compensate for each other, so that the standard deviation is equivalent to the intuitive definition of deviation? Or are the two definitions fundamentally different, in which case the standard deviation does not correctly reflect our intuitive sense of deviation?
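To make the comparison concrete, here is a minimal numerical sketch (assuming NumPy; the helper names `mean_abs_deviation` and `sample_std` and the example datasets are my own illustration, not from any particular source) that computes both quantities side by side:

```python
import numpy as np

def mean_abs_deviation(x):
    """(1/n) * sum of |x_i - mean(x)| -- the 'intuitive' measure above."""
    x = np.asarray(x, dtype=float)
    return np.mean(np.abs(x - x.mean()))

def sample_std(x):
    """sqrt((1/(n-1)) * sum of (x_i - mean(x))^2) -- the standard deviation above."""
    x = np.asarray(x, dtype=float)
    return np.std(x, ddof=1)  # ddof=1 gives the 1/(n-1) convention

# Two small example datasets that happen to share the same mean absolute deviation.
for data in ([1.0, 1.0, 3.0, 3.0], [0.0, 2.0, 2.0, 4.0]):
    print(data, mean_abs_deviation(data), sample_std(data))
```

If the two definitions were equivalent up to the $\frac{1}{n}$ versus $\sqrt{\frac{1}{n-1}}$ factor, one value would have to determine the other; comparing them on different datasets like this is one way to probe that.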