Under some regularity conditions we can compute the Fisher information as
$ - \mathbb{E}_{\theta_0} \left[\frac{\partial^2}{\partial \theta^2} \ln f(x;\theta_0)\right] $
I was wondering whether there are inequality results relating the above expectation to the same expectation taken under a different true parameter (while keeping the evaluation point of the second derivative fixed), i.e.
$ - \mathbb{E}_{\theta_1} \left[\frac{\partial^2}{\partial \theta^2} \ln f(x;\theta_0)\right] \overset{?}{\lessgtr} - \mathbb{E}_{\theta_0} \left[\frac{\partial^2}{\partial \theta^2} \ln f(x;\theta_0)\right] $
Can we say anything about the relation of these quantities?
(I already did a Google search with the obvious keywords, but didn't find anything in this direction.)
Edit:
It may be interesting to note that in exponential families with natural parametrization, that is $ f(x;\theta) = h(x)g(\theta) \exp(\theta \, T(x))$, we already have $\frac{\partial^2}{\partial \theta^2} \ln f(x;\theta) = -V_{\theta}[T]$, which does not depend on $x$, so there we have equality.
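To make this concrete, here is a small numerical sketch (my own illustration, not from any reference) using the Poisson family in its natural parametrization $\theta = \ln\lambda$, where $T(x) = x$ and $\frac{\partial^2}{\partial\theta^2}\ln f(x;\theta) = -e^{\theta}$ is constant in $x$, so averaging it under $\theta_0$ or $\theta_1$ gives the same value:

```python
import numpy as np
from math import lgamma

def logf(x, theta):
    # Log-density of the Poisson family in natural parametrization theta = log(lambda):
    # ln f(x; theta) = -e^theta + theta*x - ln(x!)
    return -np.exp(theta) + theta * x - lgamma(x + 1)

def d2_logf(x, theta, h=1e-4):
    # Central finite-difference second derivative in theta (analytically -e^theta)
    return (logf(x, theta + h) - 2 * logf(x, theta) + logf(x, theta - h)) / h**2

theta0, theta1 = 0.0, 0.7
rng = np.random.default_rng(0)
x0 = rng.poisson(np.exp(theta0), size=50_000)  # samples under theta0
x1 = rng.poisson(np.exp(theta1), size=50_000)  # samples under theta1

# Average -d^2/dtheta^2 ln f(x; theta0) under both distributions:
i0 = -np.mean([d2_logf(x, theta0) for x in x0])
i1 = -np.mean([d2_logf(x, theta0) for x in x1])
print(i0, i1)  # both approximately e^theta0 = 1, since the integrand is constant in x
```

Because the second derivative is deterministic here, the two averages agree up to floating-point noise, regardless of which distribution the samples come from.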
This equality does not hold, however, for the variance-of-the-score definition, $ \mathbb{E}_{\theta_0} \left[\left(\frac{\partial}{\partial \theta} \ln f(x;\theta_0)\right)^2\right]$.
In this case we have $\mathbb{E}_{\theta_1}\left[\left(\frac{\partial}{\partial \theta} \ln f(x;\theta_0)\right)^2\right] = \mathbb{E}_{\theta_1}[(T - \mathbb{E}_{\theta_0}[T])^2] \geq \mathbb{E}_{\theta_1}[(T - \mathbb{E}_{\theta_1}[T])^2] = I(\theta_1)$ (since the mean minimizes the expected squared deviation), so that if $I(\theta_1) > I(\theta_0)$ we get $\mathbb{E}_{\theta_1}\left[\left(\frac{\partial}{\partial \theta} \ln f(x;\theta_0)\right)^2\right] > \mathbb{E}_{\theta_0}\left[\left(\frac{\partial}{\partial \theta} \ln f(x;\theta_0)\right)^2\right]$.
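As a numerical check of this inequality chain (again an assumed illustration, not from a reference), take the Poisson family in natural parametrization $\theta = \ln\lambda$: there $T(x) = x$, the score at $\theta_0$ is $x - e^{\theta_0}$, and $I(\theta) = V_\theta[T] = e^{\theta}$:

```python
import numpy as np

# Assumed concrete example: Poisson in natural parametrization theta = log(lambda).
# Score at theta0 is T(x) - E_{theta0}[T] = x - lam0, and I(theta) = V_theta[T] = e^theta.
theta0, theta1 = 0.0, 0.7
lam0, lam1 = np.exp(theta0), np.exp(theta1)

rng = np.random.default_rng(1)
x1 = rng.poisson(lam1, size=1_000_000)   # samples under theta1

lhs = np.mean((x1 - lam0) ** 2)          # E_{theta1}[(T - E_{theta0}[T])^2], Monte Carlo
decomp = lam1 + (lam1 - lam0) ** 2       # bias-variance decomposition: V_{theta1}[T] + (mean shift)^2
I1 = lam1                                # I(theta1) = V_{theta1}[T]
I0 = lam0                                # I(theta0) = E_{theta0}[(score at theta0)^2]

print(lhs, decomp)     # Monte Carlo estimate vs closed form
print(lhs >= I1 > I0)  # the inequality chain; holds here since I(theta1) > I(theta0)
```

The bias-variance decomposition makes the interpretation visible: the gap between the two sides is exactly the squared mean shift $(\mathbb{E}_{\theta_1}[T] - \mathbb{E}_{\theta_0}[T])^2$ plus the change in the variance of $T$.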
I am still interested in more general results, and it would also be interesting to know how to interpret this.