Related to "Fisher's score function has mean zero - what does that even mean?"
I'm trying to follow Eric Zivot's maximum likelihood estimation course. I'm looking at page 12, where the information for a single observation is computed:
$$I(\pi|x_i) = var(u(\pi|x_i)) = var(\frac{x_i}{\pi} - \frac{1-x_i}{1-\pi}) = var(\frac{x_i-\pi}{\pi(1-\pi)}) = \frac{var(x_i)}{\pi^2(1-\pi)^2} $$
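As a sanity check on that algebra, here's a quick symbolic sketch (just my own check, assuming SymPy; the symbol names are placeholders):

```python
import sympy as sp

x, p = sp.symbols('x pi')

# Bernoulli log-likelihood for a single observation and its score
loglik = x * sp.log(p) + (1 - x) * sp.log(1 - p)
score = sp.diff(loglik, p)  # x/pi - (1 - x)/(1 - pi)

# confirm the score simplifies to (x - pi) / (pi * (1 - pi))
print(sp.simplify(score - (x - p) / (p * (1 - p))))  # prints 0
```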
I agree with all of that. But then they continue and say:
$$ \frac{var(x_i)}{\pi^2(1-\pi)^2}= \frac{\pi(1-\pi)}{\pi^2(1-\pi)^2} = \frac{1}{\pi(1-\pi)} $$
I think this is wrong! The way I understand it, the Bernoulli model says there's a true parameter $\pi_0$ from which the observations are sampled, so $var(x_i) = \pi_0(1-\pi_0)$ and the information is:
$$ \frac{var(x_i)}{\pi^2(1-\pi)^2}= \frac{\pi_0(1-\pi_0)}{\pi^2(1-\pi)^2} $$
Only when evaluated at the true parameter, $\pi = \pi_0$, does this reduce to $\frac{1}{\pi(1-\pi)}$ - the same point as explained here about the expectation of the score, only this time it's the variance instead of the expectation.
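To convince myself, I ran a quick simulation (a rough sketch assuming NumPy; `pi0` and `pi_eval` are arbitrary illustrative values). The empirical variance of the score matches $\frac{\pi_0(1-\pi_0)}{\pi^2(1-\pi)^2}$, not $\frac{1}{\pi(1-\pi)}$, unless $\pi = \pi_0$:

```python
import numpy as np

rng = np.random.default_rng(0)

pi0 = 0.3      # true parameter generating the data (made-up value)
pi_eval = 0.5  # parameter at which the score is evaluated (made-up value)

x = rng.binomial(1, pi0, size=1_000_000)           # x_i ~ Bernoulli(pi0)
score = (x - pi_eval) / (pi_eval * (1 - pi_eval))  # u(pi_eval | x_i)

print(score.var())                                        # empirical variance of the score
print(pi0 * (1 - pi0) / (pi_eval**2 * (1 - pi_eval)**2))  # pi0(1-pi0) / (pi^2 (1-pi)^2)
print(1 / (pi_eval * (1 - pi_eval)))                      # 1/(pi(1-pi)): only matches when pi_eval == pi0
```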
Am I correct, or am I missing something?