I know that these two measures are important in statistics, but I'm finding it hard to develop any intuition for them. I understand that, each in its own way, they represent some kind of 'average' distance between an arbitrary data point and the dataset's mean.
In my head, if I wanted to know the average distance a point has from its dataset’s mean, I would do something like this:
$\frac{\sum_{i=1}^{N}|X_i - \mu|}{N}$
which is literally the mean distance of the points from the mean of the dataset. But variance doesn't take an absolute value to get rid of the sign; it squares the deviations instead, which makes me lose my grasp on what the resulting number actually means. And standard deviation just takes the square root of the variance, muddying the waters for me even more.
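To make my confusion concrete, here's a small sketch (Python/NumPy, with a toy dataset I made up) computing all three quantities side by side. The variance ends up in squared units, which is part of what throws me off:

```python
import numpy as np

# Toy dataset, purely for illustration
x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
mu = x.mean()

mad = np.mean(np.abs(x - mu))   # mean absolute deviation: the "average distance" I have in mind
var = np.mean((x - mu) ** 2)    # population variance: average *squared* distance
std = np.sqrt(var)              # standard deviation: square root of the variance

print(mad, var, std)  # 1.5, 4.0, 2.0 for this dataset
```

So for this dataset the "average distance" I would naively compute is 1.5, but the standard deviation is 2.0, and I can't see what the 2.0 (let alone the 4.0) is telling me that the 1.5 isn't.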
Can anyone give me some kind of intuition about these two values?