I am totally confused: on the one hand, you can find all kinds of explanations of why you have to divide by n-1 to get an unbiased estimator of the (unknown) population variance (degrees of freedom, the estimator not even being defined for a sample of size 1, etc.) - see e.g. here or here.
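Just to make sure we are talking about the same thing, this is the estimator I mean (with the usual sample mean):

$$ s^2 = \frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2, \qquad \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i. $$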
On the other hand, when it comes to estimating the variance of an assumed normal distribution, none of this seems to apply anymore: there it is said that the maximum likelihood estimator for the variance divides only by n - see e.g. here.
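As far as I can tell, the estimator meant there is

$$ \hat{\sigma}^2_{\mathrm{ML}} = \frac{1}{n}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2, $$

i.e. the same sum of squared deviations, just divided by n instead of n-1.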
Now, can anyone please enlighten me as to why it holds in one case but not in the other? I mean, normality is what most models boil down to (not least due to the CLT). So is dividing by n the better choice for estimating the true population variance after all?
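To make my question concrete, here is a small simulation sketch comparing the two estimators on samples from a normal distribution (Python/NumPy; the population variance, sample size and number of repetitions are just made-up values for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

true_var = 4.0   # variance of the normal population we sample from (made-up value)
n = 5            # small sample size (made-up value)
reps = 100_000   # number of simulated samples (made-up value)

# Draw `reps` independent samples of size n from N(0, true_var)
samples = rng.normal(loc=0.0, scale=np.sqrt(true_var), size=(reps, n))

# Divide by n-1 (ddof=1, the "unbiased" estimator) vs. by n (ddof=0, the ML estimator)
var_n_minus_1 = samples.var(axis=1, ddof=1)
var_n = samples.var(axis=1, ddof=0)

print("true variance:         ", true_var)
print("mean of n-1 estimator: ", var_n_minus_1.mean())
print("mean of n   estimator: ", var_n.mean())
```

This is exactly the situation I am asking about: which of the two divisors should I prefer when my goal is to estimate the true population variance?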