Questions tagged [point-estimation]

Point estimation is the application of an estimator (a statistic) to data in order to learn about a population parameter. This parameter may be a fixed number, as in classical (frequentist) statistics, or a random variable itself, as in Bayesian statistics. Common methods of point estimation are ordinary least squares, maximum likelihood, the method of moments, and Bayes estimators.

For further information see:
Lehmann, E.L. and Casella, G. (1998) "Theory of Point Estimation", 2nd edition, Springer-Verlag, New York
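The contrast between these methods can be made concrete with a minimal sketch (synthetic data, all names mine): in a $U(0,\theta)$ model the method-of-moments and maximum-likelihood estimators of $\theta$ happen to differ, which makes the two principles visible.

```python
import random

# Toy comparison of two point estimators for theta in Uniform(0, theta).
# Synthetic data; in this model the method-of-moments and maximum-
# likelihood estimators differ, which makes the contrast visible.
random.seed(0)
theta = 5.0
x = [random.uniform(0, theta) for _ in range(1000)]

mom = 2 * sum(x) / len(x)  # method of moments: solve E[X] = theta / 2
mle = max(x)               # maximum likelihood: the sample maximum

print(mom, mle)  # both close to 5, obtained from different principles
```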

206 questions
38 votes · 3 answers

What percentage of a population needs a test in order to estimate prevalence of a disease? Say, COVID-19

A group of us got to discussing what percentage of a population needs to be tested for COVID-19 in order to estimate the true prevalence of the disease. It got complicated, and we ended the night (over Zoom) arguing about signal detection and…
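The simplest version of this calculation is the usual normal-approximation sample-size formula for a proportion. This sketch assumes simple random sampling and a perfect test, which is exactly what the signal-detection argument in the question complicates:

```python
import math

# Sample size needed to estimate a prevalence to within +/- `margin`
# at the confidence level implied by z, via the normal approximation
# n = z^2 * p * (1 - p) / margin^2.  Assumes simple random sampling
# and a test with no false positives/negatives (strong assumptions).
def sample_size(p_guess, margin, z=1.96):
    return math.ceil(z ** 2 * p_guess * (1 - p_guess) / margin ** 2)

# Worst case p = 0.5, +/- 2 percentage points, 95% confidence:
print(sample_size(0.5, 0.02))  # -> 2401
```

Note that what matters is an absolute count, not a percentage: for a large population the required fraction shrinks toward zero.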
34 votes · 3 answers

Is p-value a point estimate?

Since one can calculate confidence intervals for p-values and since the opposite of interval estimation is point estimation: Is p-value a point estimate?
31 votes · 4 answers

How to derive the likelihood function for binomial distribution for parameter estimation?

According to Miller and Freund's Probability and Statistics for Engineers, 8ed (pp. 217-218), the likelihood function to be maximised for the binomial distribution (Bernoulli trials) is given as $L(p) = \prod_{i=1}^np^{x_i}(1-p)^{1-x_i}$. How to arrive at…
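One way to see where that product leads: maximizing the corresponding log-likelihood numerically recovers the closed-form MLE $\hat p = \bar x$ (illustrative sketch, made-up data):

```python
import math

# Maximize the Bernoulli log-likelihood
#   log L(p) = sum_i [ x_i log p + (1 - x_i) log(1 - p) ]
# on a grid, and compare with the closed-form MLE, the sample proportion.
x = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]  # made-up Bernoulli observations

def log_lik(p):
    return sum(xi * math.log(p) + (1 - xi) * math.log(1 - p) for xi in x)

p_grid = max((i / 1000 for i in range(1, 1000)), key=log_lik)
p_closed = sum(x) / len(x)

print(p_grid, p_closed)  # both 0.7
```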
22 votes · 2 answers

Shrunken $r$ vs unbiased $r$: estimators of $\rho$

There has been some confusion in my head about two types of estimators of the population value of the Pearson correlation coefficient. A. Fisher (1915) showed that for a bivariate normal population the empirical $r$ is a negatively biased estimator of $\rho$,…
ttnphns
19 votes · 5 answers

Extract data points from moving average?

Is it possible to extract data points from moving average data? In other words, if a set of data only has simple moving averages of the previous 30 points, is it possible to extract the original data points? If so, how?
user16679
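A sketch of what is and is not recoverable (window of 3 instead of 30 for brevity; toy data): consecutive averages satisfy $k\,\mathrm{sma}_t - k\,\mathrm{sma}_{t-1} = x_t - x_{t-k}$, so the originals come back only if one full initial window is also known; otherwise only those lagged differences are identifiable.

```python
# Reconstructing a series from its simple moving averages.  Consecutive
# averages satisfy k*sma[t] - k*sma[t-1] = x[t] - x[t-k], so recovery
# needs the first k original points as a starting condition.
k = 3                                    # window (the question uses 30)
x = [4.0, 7.0, 1.0, 8.0, 2.0, 9.0, 5.0]  # toy "original" data
sma = [sum(x[i - k + 1:i + 1]) / k for i in range(k - 1, len(x))]

recovered = x[:k]  # assume the first full window is available
for t in range(1, len(sma)):
    recovered.append(k * sma[t] - k * sma[t - 1] + recovered[-k])

print(recovered)  # reproduces x up to float rounding
```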
18 votes · 2 answers

Is the theory of minimum variance unbiased estimation overemphasized in graduate school?

Recently I was very embarrassed when I gave an off-the-cuff answer about minimum variance unbiased estimates for parameters of a uniform distribution that was completely wrong. Fortunately I was immediately corrected by cardinal and Henry with Henry…
Michael R. Chernick
16 votes · 4 answers

Inference for the skeptical (but not math-averse) reader

I just watched a lecture on statistical inference ("comparing proportions and means"), part of an intro to stats online course. The material made as little sense to me as it always does (by now I must have seen this stuff dozens of times, spread…
kjo
14 votes · 2 answers

Invariance property of MLE: what is the MLE of $\theta^2$ of normal, $\bar{X}^2$?

Invariance property of MLE: if $\hat{\theta}$ is the MLE of $\theta$, then for any function $f(\theta)$, the MLE of $f(\theta)$ is $f(\hat{\theta})$. Also, $f$ must be a one-to-one function. The book says, "For example, to estimate ${\theta}^2$,…
user13985
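A small simulation of the property in question, assuming the $N(\theta, 1)$ model (synthetic data): the MLE of $\theta$ is $\bar X$, so by invariance the MLE of $\theta^2$ is $\bar X^2$, even though $\bar X^2$ is biased, since $E[\bar X^2] = \theta^2 + 1/n$.

```python
import random

# Invariance of the MLE: for Normal(theta, 1), the MLE of theta is the
# sample mean, so the MLE of theta^2 is the squared sample mean.
# Note the plug-in estimator is biased: E[xbar^2] = theta^2 + 1/n.
random.seed(1)
theta = 3.0
x = [random.gauss(theta, 1.0) for _ in range(5000)]

mle_theta = sum(x) / len(x)
mle_theta_sq = mle_theta ** 2  # invariance: apply f to the MLE

print(mle_theta, mle_theta_sq)  # near 3 and 9
```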
11 votes · 1 answer

Sufficiency or Insufficiency

Consider a random sample $\{X_1,X_2,X_3\}$ where the $X_i$ are i.i.d. $\mathrm{Bernoulli}(p)$ random variables with $p\in(0,1)$. Check whether $T(X)=X_1+2X_2+X_3$ is a sufficient statistic for $p$. Firstly, how can we find the distribution for…
Landon Carter
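The sample space here is small enough to settle the matter by brute force (a sketch, not taken from the thread): $T$ is sufficient iff the conditional distribution of the sample given $T$ is free of $p$, and enumerating all $2^3$ outcomes shows it is not.

```python
from fractions import Fraction
from itertools import product

# Brute-force sufficiency check for T = X1 + 2*X2 + X3 with three
# i.i.d. Bernoulli(p) observations: T is sufficient iff
# P(X = x | T = t) does not depend on p.
def cond_dist(p):
    joint = {x: p ** sum(x) * (1 - p) ** (3 - sum(x))
             for x in product((0, 1), repeat=3)}
    t_of = lambda x: x[0] + 2 * x[1] + x[2]
    t_marg = {}
    for x, pr in joint.items():
        t_marg[t_of(x)] = t_marg.get(t_of(x), 0) + pr
    return {x: pr / t_marg[t_of(x)] for x, pr in joint.items()}

d1 = cond_dist(Fraction(1, 4))
d2 = cond_dist(Fraction(1, 2))
print(d1 == d2)  # False: e.g. P(X=(1,0,1) | T=2) = p, so T is not sufficient
```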
10 votes · 1 answer

Posterior variance vs variance of the posterior mean

This question is about the frequentist properties of Bayesian methods. Suppose we have data ${\bf y}$ generated from a distribution with a single parameter $\theta$, equipped with a prior $\pi(\theta)$. This leads to a posterior distribution…
10 votes · 2 answers

Why is the geometric median called the $L_1$ estimator?

My question is simply, why is the geometric median called the $L_1$ estimator? This always reminds of $L_p$ spaces but the distance being minimized in the geometric median's definition isn't $L_1$ but rather the $L_2$ (Euclidean) norm. What does the…
Fixed Point
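Naming puzzle aside, the object itself is easy to compute: the geometric median minimizes the sum of Euclidean ($L_2$) distances, and the "$L_1$" refers to that sum-of-norms objective, by analogy with the one-dimensional median minimizing $\sum_i |x_i - m|$. A sketch of Weiszfeld's iteration on made-up 2-D points:

```python
# Weiszfeld's algorithm: an iteratively re-weighted mean converging to
# the geometric median, the minimizer of sum_i ||p_i - g||_2.  Toy data.
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]

def weiszfeld(pts, iters=200):
    gx = sum(p[0] for p in pts) / len(pts)  # start at the centroid
    gy = sum(p[1] for p in pts) / len(pts)
    for _ in range(iters):
        wsum = wx = wy = 0.0
        for x, y in pts:
            d = ((x - gx) ** 2 + (y - gy) ** 2) ** 0.5
            if d == 0:          # crude guard against a coincident point
                continue
            wsum += 1 / d
            wx += x / d
            wy += y / d
        gx, gy = wx / wsum, wy / wsum
    return gx, gy

print(weiszfeld(pts))  # beats the centroid on total L2 distance
```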
9 votes · 1 answer

Determine an unknown number of real world locations from GPS-based reports

I'm working on some software which should determine real-world locations (e.g. speed cameras) from several GPS-based reports. A user will be driving while reporting a location, so the reports are very inaccurate. To solve that problem I have to cluster…
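One simple way to attack this when the number of locations is unknown (a sketch with made-up coordinates and threshold; real code would use haversine distances and a robust center): merge reports that fall within a distance threshold of a running centroid, then take each cluster's mean as the point estimate of that location.

```python
# Greedy threshold clustering of noisy reports; each cluster's mean is
# the point estimate of one real-world location.  Coordinates, eps and
# the Euclidean metric are all illustrative simplifications.
reports = [(10.001, 20.002), (9.999, 19.998), (10.002, 20.001),
           (30.000, 40.001), (29.998, 39.999)]

def cluster(reports, eps=0.05):
    centers = []  # each entry: [sum_x, sum_y, count]
    for x, y in reports:
        for c in centers:
            cx, cy = c[0] / c[2], c[1] / c[2]
            if ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 < eps:
                c[0] += x; c[1] += y; c[2] += 1
                break
        else:
            centers.append([x, y, 1])
    return [(c[0] / c[2], c[1] / c[2]) for c in centers]

print(cluster(reports))  # two estimated locations
```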
9 votes · 1 answer

When can't Cramer-Rao lower bound be reached?

The Cramér-Rao lower bound (CRLB) gives the minimum variance of an unbiased estimator. One sentence on the Wikipedia page says "However, in some cases, no unbiased technique exists which achieves the bound. This may occur either if for any unbiased…
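For reference, the standard attainment condition (a textbook fact, not taken from the thread): an unbiased estimator $T$ of $\tau(\theta)$ reaches the bound iff equality holds in the underlying Cauchy-Schwarz step, i.e. iff the score is proportional to $T - \tau(\theta)$,

$$\frac{\partial}{\partial\theta}\log L(\theta; x) = k(\theta)\,\bigl(T(x)-\tau(\theta)\bigr),$$

which essentially restricts attainment to exponential families; outside that case (for instance, unbiasedly estimating $\sigma$ in a normal model) no unbiased estimator achieves the CRLB.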
9 votes · 3 answers

Does a Bayes estimator require that the true parameter is a possible variate of the prior?

This might be a bit of a philosophical question, but here we go: In decision theory, the risk of a Bayes estimator $\hat\theta(x)$ for $\theta\in\Theta$ is defined with respect to a prior distribution $\pi$ on $\Theta$. Now, on the one hand, for…
user32849
9 votes · 2 answers

Rao-Blackwellization of Gibbs Sampler

I am currently estimating a stochastic volatility model with Markov chain Monte Carlo methods, implementing Gibbs and Metropolis sampling. Assuming I take the mean of the posterior distribution rather than a random sample from…
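The variance reduction at stake can be sketched in a toy Gibbs sampler for a bivariate normal with correlation $\rho$ (illustrative, not the stochastic-volatility model): averaging the conditional means $E[X \mid y] = \rho y$ instead of the raw draws estimates the same quantity with visibly smaller spread.

```python
import random
import statistics

# Rao-Blackwellization in a Gibbs sampler for a standard bivariate
# normal with correlation rho.  Conditionals: X|y ~ N(rho*y, 1-rho^2).
# Both estimators below target E[X] = 0; the Rao-Blackwellized one
# averages conditional means rather than draws, and has lower variance.
random.seed(2)
rho, n = 0.5, 20000
s = (1 - rho ** 2) ** 0.5
x = y = 0.0
draws, cond_means = [], []
for _ in range(n):
    x = random.gauss(rho * y, s)
    y = random.gauss(rho * x, s)
    draws.append(x)
    cond_means.append(rho * y)  # E[X | current y]

print(statistics.fmean(draws), statistics.fmean(cond_means))    # both ~0
print(statistics.pstdev(draws), statistics.pstdev(cond_means))  # ~1.0 vs ~0.5
```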