I have this code (it comes from the book "An Introduction to Bootstrap Methods with Applications to R").
We are working with the estimator $S_n^2=\frac{1}{n}\sum_{i=1}^{n}(X_i-\bar{X})^2$, where $\bar{X} = \frac{1}{n}\sum_{i=1}^{n}X_i$; $S_n^2$ is the maximum likelihood estimator of $\sigma^2$. I'm relatively new to the statistics of likelihoods and estimators.
set.seed(5^13)
n <- 25
x <- rnorm(n)                    # random sample of unit normal variate
varx <- var(x) * (n - 1) / n     # sample variance, uncorrected
c(varx, varx - 1.0, -1/sqrt(n))  # sample variance and bias relative to true
                                 # value 1.0 and expected value of bias
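Just to check my reading of the formula, I verified with a small snippet of my own (this is not from the book; xbar and sn2 are just names I picked) that var(x)*(n-1)/n matches computing $S_n^2$ directly from the definition:

set.seed(5^13)
n <- 25
x <- rnorm(n)
xbar <- mean(x)                        # the sample mean, X-bar
sn2 <- sum((x - xbar)^2) / n           # S_n^2 straight from the definition
all.equal(sn2, var(x) * (n - 1) / n)   # TRUE: the var() shortcut matches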
I'm having trouble understanding the last element of the vector in the final line of the book's code:
-1/sqrt(n)
Can someone tell me how the author got this term, which he calls the "expected value of bias"?
Edit:
In the book he says that the estimator $S_n^2$ has a bias of $b = -\sigma^2/n$, and I think the -1/sqrt(n) is meant to be the expected value of this bias $b$. I will have to go and research how the bias of that estimator comes about, but if you have anything to enlighten me on this, it would be greatly appreciated.
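In the meantime, I tried a quick simulation of my own (this snippet is mine, not from the book; n, B, and varx_reps are just my choices) to estimate the bias of $S_n^2$ empirically:

# My own sanity check (not from the book): estimate the bias of S_n^2
# by averaging it over many samples of size n drawn from N(0, 1).
set.seed(123)
n <- 25
B <- 100000                    # number of simulated samples
varx_reps <- replicate(B, {
  x <- rnorm(n)
  var(x) * (n - 1) / n         # S_n^2 for one sample
})
mean(varx_reps) - 1.0          # estimated bias; comes out near -1/n = -0.04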
I'm familiar with why we divide by (n-1) when estimating the population variance, from both an intuitive and a mathematical perspective. He is demonstrating the importance of this by un-correcting the output of var(), i.e. multiplying it by (n-1)/n to get the biased maximum likelihood estimate.
I believe my question is even simpler than that: I cannot see how -1/sqrt(n) connects to any of this. I suspect it's some basic math that I'm not seeing.
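For concreteness (this bit of arithmetic is mine, not from the book), here are the two quantities side by side for n = 25:

# The bias the book states, -sigma^2/n with sigma^2 = 1, versus the
# term -1/sqrt(n) that actually appears in the code, for n = 25.
n <- 25
c(-1/n, -1/sqrt(n))   # -0.04 vs -0.20; they differ by a factor of sqrt(n) = 5

That gap between -0.04 and -0.20 is exactly the connection I can't account for.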