
Let's say $X_1, X_2, ..., X_n$ are iid $N(\mu,1)$. We can estimate $\mu$ with $\overline{X}_n$, which is distributed as $N(\mu, 1/n)$. We can estimate $P(X \leq k)$ with $\widehat{\theta}_n = \Phi(k-\overline{X}_n)$, where $k$ is a fixed real number.

What is the variance of $\widehat{\theta}_n$?

193381
  • Is this homework? – Néstor Nov 09 '14 at 07:04
  • No, my civil engineering friend asked me this for his research problem, and I felt bad that I couldn't answer on the spot. I was thinking of the probability integral transform theorem, though... – 193381 Nov 09 '14 at 07:05

1 Answer


The question here focuses on estimating a probability. The proposed estimator

$$\hat P(X\leq k) = \Phi(k-\bar X_n)$$

is a consistent estimator: since $\text{plim}\, \bar X_n = \mu$, the Continuous Mapping (Mann-Wald) Theorem gives

$$\text{plim} \Phi(k-\bar X_n) = \Phi(k-\mu) = P(X-\mu \leq k-\mu) = P(X \leq k)$$
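As a quick numerical sanity check (a minimal sketch, with the illustrative values $\mu = 0$ and $k = 1$ chosen only for concreteness, not taken from the question), the estimator does settle near $\Phi(k-\mu) \approx 0.8413$ as $n$ grows:

```python
# Sketch: consistency of Phi(k - Xbar_n); mu = 0 and k = 1 are illustrative choices.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu, k = 0.0, 1.0
target = norm.cdf(k - mu)                 # P(X <= k) = Phi(k - mu)

for n in (10, 100, 10_000):
    x = rng.normal(mu, 1.0, size=n)       # iid N(mu, 1) sample
    theta_hat = norm.cdf(k - x.mean())    # Phi(k - Xbar_n)
    print(n, theta_hat, target)
```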

The standard normal CDF $\Phi$ is here treated as "just another function", since the random variable $k-\bar X_n$ does not have a standard normal distribution, but rather

$$k-\bar X_n \sim N(k-\mu, 1/n)$$

and so the transformation $\Phi(k-\bar X_n)$ does not have a uniform $U(0,1)$ distribution, although it does range in $[0,1]$.
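This concentration is easy to see in a small simulation (again only a sketch, with the same illustrative $\mu = 0$, $k = 1$ and a fixed $n = 25$): the replicated values of $\Phi(k-\bar X_n)$ pile up near $\Phi(k-\mu)$ instead of spreading over $[0,1]$ the way a uniform variate would.

```python
# Sketch: the estimator is not U(0,1); mu, k and n are illustrative choices.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu, k, n, reps = 0.0, 1.0, 25, 100_000
xbar = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)
theta_hat = norm.cdf(k - xbar)

print(theta_hat.mean(), theta_hat.std())  # concentrated near Phi(1) ~ 0.84
print(1 / np.sqrt(12))                    # a U(0,1) sd would be ~ 0.289
```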

To consider the asymptotic distribution we use the Delta Method: applying the Mean Value Theorem we have (as an exact relation)

$$\Phi(k-\bar X_n) = \Phi(k-\mu)+ \Phi'(\tilde m)\cdot\big[(k-\bar X_n)-(k-\mu)\big]$$

where $\tilde m$ is some point between $k-\bar X_n$ and $k-\mu$. Rearranging and multiplying by $\sqrt n$ we obtain

$$\sqrt n\left(\Phi(k-\bar X_n) - \Phi(k-\mu)\right) = \phi(\tilde m)\cdot\sqrt n(\mu-\bar X_n) \tag{1}$$

with $\phi$ being the standard normal PDF. Since $\bar X_n \sim N(\mu, 1/n)$, we have (here in fact exactly, for every $n$)

$$\sqrt n(\mu-\bar X_n) \xrightarrow{d} N(0,1) $$

Moreover, since $k-\bar X_n$ consistently estimates $k-\mu$, $\tilde m$ is asymptotically sandwiched towards $k-\mu$, and so, by using the Continuous Mapping Theorem again, $\phi(\tilde m) \xrightarrow{p} \phi(k-\mu)$. Combining these results with Slutsky's lemma we obtain

$$\sqrt n\left(\Phi(k-\bar X_n) - \Phi(k-\mu)\right) \xrightarrow{d} N\left(0,[\phi(k-\mu)]^2 \right) \tag{2}$$

which yields the following finite-sample approximation to the variance requested in the question,

$$\text{Var}\left[\Phi(k-\bar X_n)\right] \approx \frac {e^{-(k-\mu)^2}}{2\pi n}$$
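A short Monte Carlo sketch (same illustrative $\mu = 0$, $k = 1$, now with $n = 100$; none of these values come from the question) suggests the approximation is already close at moderate sample sizes:

```python
# Sketch: empirical variance of Phi(k - Xbar_n) vs. exp(-(k-mu)^2)/(2*pi*n).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
mu, k, n, reps = 0.0, 1.0, 100, 50_000
xbar = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)
theta_hat = norm.cdf(k - xbar)

print(theta_hat.var())                           # simulated variance
print(np.exp(-(k - mu) ** 2) / (2 * np.pi * n))  # delta-method approximation
```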

Unavoidably, if we want an estimate of this variance, we will have to plug in the observed value $\bar X_n = \bar x_n$ in place of $\mu$, given our choice of $k$.
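In code, such a plug-in estimate of the standard error might look like the following sketch (the sample here is simulated, and $k = 1$ is again only an illustrative choice):

```python
# Sketch: plug-in standard error phi(k - xbar_n) / sqrt(n) from one observed sample.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
mu, k, n = 0.0, 1.0, 100
x = rng.normal(mu, 1.0, size=n)             # stands in for the observed data
xbar = x.mean()

theta_hat = norm.cdf(k - xbar)              # point estimate of P(X <= k)
se_hat = norm.pdf(k - xbar) / np.sqrt(n)    # plug-in standard error
print(theta_hat, se_hat)
```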

Alecos Papadopoulos
  • In your equality $P(X_i\leq k) = \Phi\left(\frac {k - \bar X_n}{\sqrt{(n-1)/n}}\right)$, the RHS is a random variable. About the final asymptotic result, we could, I think, get it with the Delta method (which is what you heuristically do). – Stéphane Laurent Nov 09 '14 at 16:29
  • @StéphaneLaurent Indeed, thanks for the remark. I was planning some changes in this answer, I will incorporate this too. – Alecos Papadopoulos Nov 09 '14 at 19:37