It depends on how you want to calculate variance. Remember that you're doing inferential statistics, so there is some unknown true variance of the population distribution, and we perform calculations to try to estimate that true value.
There are many ways to construct estimators. A common method is maximum likelihood estimation (MLE). Under a normal model, that approach gives $\hat{\sigma}_{MLE}^2 = \dfrac{\sum_{i=1}^n(x_i - \bar{x})^2}{n}$. Since this so closely resembles the variance calculation for a population (the average of the squared deviations from the mean), it is sometimes called the population variance formula.
However, $\hat{\sigma}_{MLE}^2$ has the annoying feature that its expected value is not the population parameter $\sigma^2$: in fact $E[\hat{\sigma}_{MLE}^2] = \frac{n-1}{n}\sigma^2$, so on average it underestimates the true variance. To correct for that, we divide by $n-1$ instead of $n$, and then the expected value of $\hat{\sigma}^2 = \dfrac{\sum_{i=1}^n(x_i - \bar{x})^2}{n-1}$ is exactly the population parameter $\sigma^2$.
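As a quick sanity check, here is a sketch in Python (the data values are arbitrary) comparing the two formulas against the standard library, where `statistics.pvariance` divides by $n$ and `statistics.variance` divides by $n-1$:

```python
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(data)
xbar = sum(data) / n
ss = sum((x - xbar) ** 2 for x in data)  # sum of squared deviations from the mean

var_mle = ss / n              # biased "population" formula (divide by n)
var_unbiased = ss / (n - 1)   # Bessel-corrected sample variance (divide by n - 1)

# The standard library implements both conventions:
assert abs(var_mle - statistics.pvariance(data)) < 1e-12
assert abs(var_unbiased - statistics.variance(data)) < 1e-12
print(var_mle, var_unbiased)  # → 4.0 4.571428571428571
```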
In statistical terminology, the $\hat{\sigma}_{MLE}^2$ estimator is biased while $\hat{\sigma}^2$ is unbiased. All else equal, we would prefer an unbiased estimator.
Somewhat surprisingly, when you take the square root of either estimator, you get a biased estimator of the standard deviation (this follows from Jensen's inequality: for a non-degenerate random variable, $E[\sqrt{X}] < \sqrt{E[X]}$). There is no formula for an unbiased estimator of the standard deviation that works in general. A correction has been worked out for the normal distribution (and probably some other common ones), but the formula for a normal population differs from the one for, say, an exponential population.
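A small simulation makes this bias visible (a sketch; the sample size, replication count, and seed are arbitrary choices). With normal data, averaging many Bessel-corrected sample variances lands near the true $\sigma^2$, but averaging their square roots lands noticeably below the true $\sigma$:

```python
import math
import random

random.seed(0)
sigma = 2.0      # true population standard deviation
n = 5            # small samples exaggerate the bias
reps = 100_000

variances, sds = [], []
for _ in range(reps):
    sample = [random.gauss(0.0, sigma) for _ in range(n)]
    xbar = sum(sample) / n
    v = sum((x - xbar) ** 2 for x in sample) / (n - 1)  # unbiased for variance
    variances.append(v)
    sds.append(math.sqrt(v))  # ...but its square root is biased for sigma

mean_var = sum(variances) / reps
mean_sd = sum(sds) / reps
print(mean_var)  # close to sigma**2 = 4
print(mean_sd)   # systematically below sigma = 2 (around 0.94 * sigma for n = 5)
```

For a normal population with $n = 5$, the known correction factor is $c_4 \approx 0.94$, so dividing the sample standard deviation by $c_4$ would remove the bias, but that factor is specific to the normal distribution.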
As a heads up, when people don't specify which estimators they are using, assume the unbiased estimator for variance and its square root for standard deviation (unless you know your field deviates from this convention).
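Python's standard library follows this same convention, for what it's worth: `statistics.variance` and `statistics.stdev` use $n-1$, with `stdev` defined as the square root of the unbiased variance estimate (the data values below are arbitrary):

```python
import math
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

# statistics.variance / statistics.stdev use the n - 1 convention;
# stdev is the square root of the (unbiased) sample variance.
assert math.isclose(statistics.stdev(data), math.sqrt(statistics.variance(data)))
print(statistics.stdev(data))
```

Be aware that other libraries choose differently; always check which convention a function's documentation states before comparing results.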