Your question appears to betray a confusion.
Firstly, the standard deviation is the square root of the variance, essentially by definition (see the second paragraph). So if you have the variance, you simply take its square root to get the standard deviation.
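In symbols, for the population and sample versions respectively:

$$\sigma = \sqrt{\operatorname{Var}(X)}, \qquad s = \sqrt{s^2}.$$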
Normally you must use the mean at some point in the calculation of either the variance or the standard deviation. It's possible to compute a variance, and hence a standard deviation, without ever computing the mean, but it's less efficient than using the mean (one such formula is shown below).
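As one illustration (not necessarily the formula the question had in mind), the sample variance can be written entirely in terms of pairwise differences, with no mean anywhere:

$$s^2 = \frac{1}{2n(n-1)} \sum_{i=1}^{n} \sum_{j=1}^{n} (x_i - x_j)^2.$$

Evaluating the double sum directly costs $O(n^2)$ operations, versus $O(n)$ for the usual mean-based formula, which is the sense in which avoiding the mean is less efficient.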
It's possible to compute the variance in one pass (and hence the standard deviation, since we just take the square root at the end), but the "obvious formula" based on the average of the squares minus the square of the average is not numerically stable and cannot be recommended.
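To make the instability concrete, here is a small Python sketch (the function names and test data are my own, purely for illustration) comparing the naive "mean of squares minus square of the mean" formula with a straightforward two-pass computation on data that have a large offset:

```python
def naive_variance(xs):
    """Population variance via E[x^2] - (E[x])^2 -- numerically unstable."""
    n = len(xs)
    mean = sum(xs) / n
    mean_sq = sum(x * x for x in xs) / n
    return mean_sq - mean * mean

def two_pass_variance(xs):
    """Population variance via the definition: mean of squared deviations."""
    n = len(xs)
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / n

# A huge mean with a tiny spread: sum(x^2) and n*mean^2 are nearly equal,
# so subtracting them throws away almost every significant digit.
data = [1e9 + 4.0, 1e9 + 7.0, 1e9 + 13.0, 1e9 + 16.0]  # true variance 22.5

print(naive_variance(data))     # wildly wrong: cancellation, may even be negative
print(two_pass_variance(data))  # 22.5
```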
A stable one-pass calculation of variance is given in this answer as pseudocode.
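I can't reproduce that pseudocode here, but the standard stable one-pass method is Welford's algorithm. A minimal Python sketch of it (identifier names are my own, and this is the textbook version rather than necessarily the exact pseudocode referred to):

```python
def welford_variance(xs):
    """One-pass (Welford) computation of the mean and sample variance.

    Maintains a running mean and a running sum of squared deviations (m2),
    so large means do not cause catastrophic cancellation.
    """
    n = 0
    mean = 0.0
    m2 = 0.0  # sum of squared deviations from the current mean
    for x in xs:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)  # note: uses the *updated* mean
    if n < 2:
        raise ValueError("need at least two observations")
    return mean, m2 / (n - 1)  # sample variance; use m2 / n for the population version

mean, var = welford_variance([1e9 + 4.0, 1e9 + 7.0, 1e9 + 13.0, 1e9 + 16.0])
print(mean, var, var ** 0.5)  # the standard deviation is just the square root
```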