I understand that the Maximum Likelihood Estimator for the variance is, in general, biased (estimating the mean from the sample itself costs one degree of freedom, etc.):
MLE <- sum((x - mean(x))^2) / n
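(A quick check: here's a minimal simulation sketch, with made-up values for n, mu, and sigma, whose average comes out near (n - 1)/n * sigma^2 rather than sigma^2:)

    # Minimal sketch: empirical mean of the MLE variance for an i.i.d. sample.
    # n, mu, and sigma are made-up values, purely for illustration.
    set.seed(1)
    n     <- 10
    mu    <- 5
    sigma <- 2
    mle_var <- replicate(1e5, {
      x <- rnorm(n, mean = mu, sd = sigma)
      sum((x - mean(x))^2) / n    # MLE: divides by n, not n - 1
    })
    mean(mle_var)    # close to (n - 1)/n * sigma^2 = 3.6, not sigma^2 = 4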
But in simple linear regression, where the errors are assumed to be independent and identically distributed as N(0, sigma^2), the MLE for sigma^2 becomes
s^2 <- sum(residuals^2) / n    # residuals from the fitted line, not the true errors
Is it still a biased estimator? (According to the textbook, yes, but I don't see why.) Can somebody be so kind as to explain this to me, please? I'm so confused.
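In case it helps, here is the same kind of minimal simulation sketch for the regression case (the intercept b0, slope b1, and x values are made up); empirically its average seems to land near (n - 2)/n * sigma^2 rather than sigma^2:

    # Minimal sketch: empirical mean of the MLE variance in simple linear regression.
    # b0, b1, sigma, and x are made-up values, purely for illustration.
    set.seed(1)
    n     <- 10
    b0    <- 1
    b1    <- 2
    sigma <- 2
    x     <- seq_len(n)
    mle_var <- replicate(1e5, {
      y   <- b0 + b1 * x + rnorm(n, sd = sigma)
      fit <- lm(y ~ x)
      sum(resid(fit)^2) / n    # MLE: sum of squared residuals over n
    })
    mean(mle_var)    # close to (n - 2)/n * sigma^2 = 3.2, not sigma^2 = 4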