Given the range $R$ ($= \max - \min$) of $n$ data points $x_1, x_2, \ldots, x_n$, what is the range of their standard deviation $\sigma$? It is easy to see that the minimum value of $\sigma$ is $0$ (when all data points are equal), but what is the maximum value of $\sigma$, and why? My intuition is that $\sigma$ may be largest when the $n$ points are equally spaced (e.g., $\frac{0}{n-1}R, \frac{1}{n-1}R, \frac{2}{n-1}R, \ldots, \frac{n-1}{n-1}R$), which gives $\sigma = \sqrt{\frac{n+1}{12(n-1)}}\,R$ if I did my math right. But I am not sure. Can anyone prove this, or disprove it and establish (with proof) the true maximum value of $\sigma$?
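For reference, here is a quick numerical check of the closed form above. It assumes the *population* standard deviation (divisor $n$, not $n-1$); `sigma_equally_spaced` and `sigma_formula` are just illustrative names.

```python
import math

def sigma_equally_spaced(n, R=1.0):
    """Population standard deviation of n equally spaced points spanning [0, R]."""
    xs = [i * R / (n - 1) for i in range(n)]
    mean = sum(xs) / n
    return math.sqrt(sum((x - mean) ** 2 for x in xs) / n)

def sigma_formula(n, R=1.0):
    """The conjectured closed form: sqrt((n+1) / (12(n-1))) * R."""
    return math.sqrt((n + 1) / (12 * (n - 1))) * R

for n in (2, 5, 10, 100):
    print(n, sigma_equally_spaced(n), sigma_formula(n))
```

The two columns agree for every $n$ tried, so the formula itself checks out numerically; the open question is whether equal spacing actually maximizes $\sigma$ among all configurations with range $R$.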
More generally, the above is the one-dimensional case. I am also curious about the analogous result for higher-dimensional data points. Any ideas or proof sketches would be very welcome.