Questions tagged [cramer-rao]
6 questions
5 votes, 1 answer
How do these results show that $T(\mathbf{X})$ is an unbiased estimator of $E_\varphi[T(\mathbf{X})]$ that achieves the Cramér-Rao lower bound?
Let's say that $X_1, \dots, X_n$ have the joint distribution $f_\varphi(\mathbf{x})$, which belongs to the one-parameter exponential family
$$f_\varphi(\mathbf{x}) = \exp{\left\{ c(\varphi) T(\mathbf{x}) + d(\varphi) + s(\mathbf{x}) \right\}},$$
where…

The Pointer
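
A sketch of the standard argument, assuming the usual regularity conditions (the question body is truncated, so the exact conditions stated there are not visible here): differentiating the normalization $\int f_\varphi(\mathbf{x})\,d\mathbf{x} = 1$ in $\varphi$ identifies $E_\varphi[T(\mathbf{X})]$, and the score turns out to be exactly linear in $T$.

```latex
% Sketch: why T attains the CRLB in a one-parameter exponential family.
% Differentiating the normalization constraint under the integral sign:
%   0 = E_phi[ c'(phi) T(X) + d'(phi) ]  =>  E_phi[T(X)] = -d'(phi)/c'(phi).
% The score is then an exact linear function of T:
\frac{\partial}{\partial\varphi}\log f_\varphi(\mathbf{X})
  = c'(\varphi)\,T(\mathbf{X}) + d'(\varphi)
  = c'(\varphi)\bigl(T(\mathbf{X}) - E_\varphi[T(\mathbf{X})]\bigr).
% Linearity makes the Cauchy-Schwarz step of the CRLB proof an equality,
% so Var(T) equals the Cramér-Rao bound for estimating E_phi[T(X)].
```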
1 vote, 1 answer
Why does $T$ being an unbiased estimator for $g(\theta)$ imply that $g(\theta) = ET = \int T(\mathbf{y}) f_\theta(\mathbf{y}) \ d\mathbf{y}$?
I am currently studying the Cramer-Rao lower bound. My notes say the following:
Theorem (Cramér-Rao lower bound)
Let $Y_1, \dots, Y_n$ have a joint distribution $f_\theta (\mathbf{y})$, where $f_\theta (\mathbf{y})$ satisfies the following two…

The Pointer
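
The step being asked about is definitional rather than part of the theorem's proof; in the question's own notation:

```latex
% Unbiasedness of T for g(theta) *means* E_theta[T] = g(theta) for every theta,
% and the expectation of the statistic T(Y) under the joint density f_theta is,
% by definition, the integral over the sample space:
g(\theta) \;=\; E_\theta[T] \;=\; \int T(\mathbf{y})\, f_\theta(\mathbf{y})\, d\mathbf{y}.
```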
0 votes, 0 answers
Connection between the Fisher information matrix and the Gaussian-weighted structure tensor
In image processing, if we let $I(x):\mathbb{R}^2\to \mathbb{R}$ denote the function that gives the brightness value at an image location $x=(u,v)^\top\in\mathbb{R}^2$, then the structure tensor over a neighbourhood of pixels $\Omega$ (centered…

Javier TG
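
No answer yet; as background for readers, a minimal NumPy/SciPy sketch of the Gaussian-weighted structure tensor the question refers to (the function name and the choice $\sigma = 2$ are illustrative, not taken from the question):

```python
# Sketch: Gaussian-weighted structure tensor of a grayscale image I(u, v).
import numpy as np
from scipy.ndimage import gaussian_filter

def structure_tensor(image, sigma=2.0):
    """Return the 2x2 tensor field J = G_sigma * (grad I)(grad I)^T, entrywise."""
    Iu, Iv = np.gradient(image.astype(float))  # gradients along u and v
    # Smooth the outer-product entries with a Gaussian window (the weighting over Omega).
    Juu = gaussian_filter(Iu * Iu, sigma)
    Juv = gaussian_filter(Iu * Iv, sigma)
    Jvv = gaussian_filter(Iv * Iv, sigma)
    return Juu, Juv, Jvv  # J(x) = [[Juu, Juv], [Juv, Jvv]] at each pixel x

# Usage: the eigenvalues of J at a pixel classify it as flat / edge / corner.
img = np.random.rand(64, 64)
Juu, Juv, Jvv = structure_tensor(img)
```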
0 votes, 0 answers
Analogous information matrix and divergence for the Bhattacharyya bound
In the case of the Cramér-Rao lower bound (CRLB), the Fisher information matrix (FIM) is obtained from the K-L divergence (KLD), i.e. $D(p_\theta\|p_{\theta'}) = \int p_\theta(x)\log\frac{p_\theta(x)}{p_{\theta'}(x)}\,dx$, by taking derivatives as
$$g(\theta)…

r2d2
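
Also unanswered; as a warm-up, the CRLB-side fact the question wants to generalize, checked numerically on a toy model of my choosing (a $N(\theta, 1)$ mean, for which the Fisher information is $1$): the FIM is the curvature of the KLD at $\theta' = \theta$.

```python
# Check: the second derivative of D(p_theta || p_theta') in theta' at theta' = theta
# equals the Fisher information. Toy model: N(theta, 1), where I(theta) = 1.
import numpy as np

def kl_gauss(theta, theta_p, sigma2=1.0):
    # Closed-form KL divergence between N(theta, sigma2) and N(theta_p, sigma2).
    return (theta - theta_p) ** 2 / (2.0 * sigma2)

theta, h = 0.7, 1e-4
# Central second finite difference in theta' at theta' = theta.
curvature = (kl_gauss(theta, theta + h) - 2 * kl_gauss(theta, theta)
             + kl_gauss(theta, theta - h)) / h**2
print(curvature)  # ~1.0, the Fisher information of the mean of N(theta, 1)
```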
0 votes, 0 answers
CRLB derivation: estimator independence of the estimated parameter
I have followed the CRLB derivation, but there is one step I couldn't figure out:
Let $f(x; \theta)$ be a probability density with continuous parameter $\theta$, let $X_1, \dots, X_n$ be independent random variables with density $f(x; \theta)$, and let $\Theta(X_1, \dots, X_n)$ be an unbiased…

Jonathan
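
A guess at the sticking point, since the question is cut off: the estimator $\Theta(X_1, \dots, X_n)$ is a fixed function of the data alone, so a $\theta$-derivative passes through it and acts only on the density.

```latex
% Differentiating the unbiasedness identity E_theta[Theta] = theta in theta
% (f_theta below is the joint density of X_1, ..., X_n):
1 = \frac{\partial}{\partial\theta}\int \Theta(\mathbf{x})\, f_\theta(\mathbf{x})\, d\mathbf{x}
  = \int \Theta(\mathbf{x})\, \frac{\partial f_\theta(\mathbf{x})}{\partial\theta}\, d\mathbf{x}
  = E_\theta\!\left[\Theta \,\frac{\partial \log f_\theta(\mathbf{X})}{\partial\theta}\right].
% Theta carries no theta-dependence, which is exactly why it is never
% differentiated; the CRLB then follows by Cauchy-Schwarz on this identity.
```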
0 votes, 1 answer
How can one show that $\bar{X}$ is the best unbiased estimator for $\lambda$ without using the Cramér-Rao lower bound?
Assume we have the random sample $X_1, \dots, X_n$ with mean $\mu$ and variance $\sigma^2 < \infty$. We have that $E[S^2] = \sigma^2$, where $S^2 = \sum_{i = 1}^n \dfrac{(X_i - \bar{X})^2}{n - 1}$ and $\bar{X} = \sum_{i = 1}^n \dfrac{X_i}{n}$. Now…

The Pointer
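
Assuming the sample is Poisson($\lambda$), which the $\lambda$ in the title suggests (the excerpt is truncated before the model is stated), the standard CRLB-free route is completeness plus sufficiency:

```latex
% Lehmann-Scheffe sketch, assuming X_1, ..., X_n iid Poisson(lambda):
% 1. S = X_1 + ... + X_n is a complete sufficient statistic
%    (full-rank exponential family).
% 2. \bar{X} = S/n is unbiased for lambda and is a function of S.
% 3. Therefore \bar{X} is the (essentially unique) UMVUE of lambda:
E_\lambda[\bar{X}] = \lambda
\quad\Longrightarrow\quad
\bar{X} = E_\lambda[\bar{X} \mid S] \text{ is UMVUE by Lehmann–Scheffé.}
```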