We know that standard deviation (SD) measures the dispersion of a distribution. Thus a distribution taking only one value (e.g., 1, 1, 1, 1) has an SD of zero. Such a distribution also requires little information to describe. On the other hand, a distribution with a high SD seems to require many bits of information to describe, so we could say its entropy is high.
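For concreteness, here is a minimal Python sketch (my own toy illustration, with made-up sample values) that computes the SD alongside the empirical Shannon entropy, where each distinct value's relative frequency is treated as its probability:

```python
import numpy as np
from collections import Counter

def shannon_entropy(values, base=2):
    """Empirical Shannon entropy (in bits by default) of a sample,
    using each distinct value's relative frequency as its probability."""
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * (np.log(p) / np.log(base))).sum()

samples = {
    "constant":    [1, 1, 1, 1],           # SD = 0, entropy = 0 bits
    "two values":  [0, 0, 10, 10],         # large SD, 1 bit of entropy
    "spread out":  [0, 3, 7, 10],          # similar SD, 2 bits of entropy
    "tiny spread": [0, 0.01, 0.02, 0.03],  # tiny SD, still 2 bits of entropy
}

for name, x in samples.items():
    print(f"{name:12s} SD = {np.std(x):6.3f}   entropy = {shannon_entropy(x):.3f} bits")
```

The last two samples give nearly the same entropy despite very different SDs, which is exactly what makes me unsure whether the two quantities are really measuring the same thing.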

So my question is: is SD the same thing as entropy?
If not, what relationship exists between these two measures?