
I have two questions, each of which I think might be related to each other but I'm not sure. Both concern the definition of variance as:

$var(x) = s_x^2 = \dfrac{1}{n-1} \sum_{i=1}^{n}(x_i - \bar{x})^2$

(1) How do we prove that $\sum_{i=1}^{n}(x_i - \bar{x}) = 0$? I can see that this is true using a few examples but I'm unsure how to do a general proof.

(2) In this definition, what is the conceptual motivation for squaring the differences, as opposed to cubing them or using some other exponent?

1 Answer


Well, the first one is really easy:

Since

$\bar{x} = \frac{1}{N}\sum_i x_i$

it follows that

$\sum_i x_i = \bar{x}N$

so, splitting the sum and noting that $\sum_i \bar{x}$ is simply $\bar{x}$ added up $N$ times, you have:

$\sum_i (x_i - \bar{x}) = \sum_i x_i - \sum_i \bar{x} = \bar{x}N - \bar{x}N = 0$
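
If you want to sanity-check this numerically, here is a quick sketch (plain Python with NumPy and made-up data; it's not part of the proof, just a confirmation):

```python
import numpy as np

# Any made-up sample will do.
x = np.array([2.0, 4.0, 4.0, 5.0, 7.0, 9.0])
x_bar = x.mean()

# The deviations from the mean sum to zero (up to floating-point error).
print(np.sum(x - x_bar))

# The 1/(n-1) formula from the question matches NumPy's ddof=1 sample variance.
n = len(x)
s2_manual = np.sum((x - x_bar) ** 2) / (n - 1)
print(s2_manual, np.var(x, ddof=1))  # the two values agree
```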

Regarding your second question, I think it is mostly related to analytical tractability, but I'll let others make a more informed contribution.
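
For instance, one small illustration of that tractability (just a sketch of my own, with arbitrary example data): the sum of squared deviations $\sum_i (x_i - c)^2$ is a smooth quadratic in $c$ whose derivative vanishes exactly at $c = \bar{x}$, which is the kind of clean algebra you don't get with, say, absolute differences. You can confirm the minimiser numerically:

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 5.0, 7.0, 9.0])  # arbitrary example data

# Sum of squared deviations around a candidate centre c.
def sum_sq_dev(c):
    return np.sum((x - c) ** 2)

# Brute-force the minimiser over a fine grid of candidate centres.
grid = np.linspace(x.min(), x.max(), 10001)
best_c = grid[np.argmin([sum_sq_dev(c) for c in grid])]

print(best_c, x.mean())  # the minimiser coincides (numerically) with the sample mean
```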

elelias
  • This is somewhat helpful; however: (a) How does $\Sigma_i \bar{x} = \bar{x}N$? (b) How is it that $\Sigma_i x_i - \Sigma_i \bar{x} = \Sigma_{i=1}^{n} (x_i - \bar{x})$? – letsmakemuffinstogether Sep 01 '18 at 17:43
  • @letsmakemuffinstogether $\bar{x} = \frac{\sum_{i}{x_{i}}}{N}$; multiplying both sides by $N$ gives $N\bar{x} = \sum_{i}{x_{i}}$. And $\sum_{i}{\bar{x}} = \bar{x} + \bar{x} + \dots + \bar{x}$ ($N$ times). – Alexis Sep 01 '18 at 18:52
  • In English, that says: "sum the average value $\bar{x}$, $N$ times." Imagine the average is 5 and there are $N=100$ observations. You are summing 5, 100 times, right? That's obviously 100*5. – elelias Sep 01 '18 at 19:23
  • Hi Alexis. I understand the claim you just made; however, if you look very closely at my question (a), you'll see that I am asking how it is that $\Sigma_{i}\bar{x} = \bar{x}N$, not the more obvious question of how $N\bar{x} = \Sigma_i x_i$... Notice the difference: there is a bar over the $x$ on both sides in my question. – letsmakemuffinstogether Sep 02 '18 at 20:50