
I am stuck on the following question and I was wondering if I can get some help.

Let $f(x;\theta) = g(\theta)h(x)$ for $a(\theta) \leqslant x \leqslant b(\theta)$, where $a(\theta)$ is decreasing and $b(\theta)$ is increasing in $\theta$, and $g(\theta)$ is differentiable. Let $X_1,\ldots,X_n$ be a random sample with order statistics $X_{(1)}<\cdots<X_{(n)}$. Consider the statistic $T = \max(a^{-1}(X_{(1)}), b^{-1}(X_{(n)}))$.

I have to prove that the MVUE of $\theta$ is $\hat{\theta}=T-\frac{g(T)}{ng'(T)}$.

I have verified completeness and sufficiency for $T$, and since $\hat{\theta}$ is a one-to-one function of $T$, it is also sufficient and complete by known theorems. I am now trying to calculate $E(T)$ using the following answer to a related post: Find the unique MVUE

However, in that question they were able to use properties of the uniform distribution to find the CDF, whereas for my particular problem I do not believe that argument applies. I was wondering if I could get some hints on how to come up with a suitable CDF?

EDIT I am up to this part: \begin{equation} F(t) = P(T \leqslant t) = P(a(t)\leqslant X_{(1)} \leqslant X_{(2)} \leqslant\cdots\leqslant X_{(n)} \leqslant b(t)) \end{equation}

But again I do not know how to proceed.

EDIT So I believe that to get the result I have to use integration by parts to obtain $E(T)=\theta+E\left(\frac{g(T)}{n g^{\prime}(T)}\right)$. Since $$ P_{\theta}(T \leq t)=(g(\theta))^{n}\left(\int_{a(t)}^{b(t)} h(x)\, d x\right)^{n} $$ and $$ f_{T}(t)=n(g(\theta))^{n}\left(\int_{a(t)}^{b(t)} h(x)\, d x\right)^{n-1}\left[h(b(t)) b^{\prime}(t)-h(a(t)) a^{\prime}(t)\right] \mathbf{1}_{0<t<\theta}, $$ I have $$ E(T)=\int_{0}^{\theta} t\, n(g(\theta))^{n}\left(\int_{a(t)}^{b(t)} h(x)\, d x\right)^{n-1}\left[h(b(t)) b^{\prime}(t)-h(a(t)) a^{\prime}(t)\right] d t, $$ and for integration by parts I take $$ u=t\, n(g(\theta))^{n}\left(\int_{a(t)}^{b(t)} h(x)\, d x\right)^{n-1}, $$ $$ d u=n(g(\theta))^{n}\left[\left(\int_{a(t)}^{b(t)} h(x)\, d x\right)^{n-1}+t(n-1)\left(\int_{a(t)}^{b(t)} h(x)\, d x\right)^{n-2}\left(h(b(t)) b^{\prime}(t)-h(a(t)) a^{\prime}(t)\right)\right] d t, $$ $$ v=\int_{a(\theta)}^{b(\theta)}\left[h(b(t)) b^{\prime}(t)-h(a(t)) a^{\prime}(t)\right] d t. $$

But $$ v=\int_{a(\theta)}^{b(\theta)} h(b(t)) b^{\prime}(t)\, d t-\int_{a(\theta)}^{b(\theta)} h(a(t)) a^{\prime}(t)\, d t, $$ and with the substitutions $r=b(t),\ dr=b^{\prime}(t)\, dt$ and $s=a(t),\ ds=a^{\prime}(t)\, dt$, together with $\int_{a(\theta)}^{b(\theta)} h(x)\, d x=\frac{1}{g(\theta)}$, it seems $v$ should be $0$, so I am stumped. I was hoping for a hint as to how to obtain $uv=\theta$.

  • The cdf$$F(x)=\int_{a(\theta)}^x h(y)\text{d}y\Big/\int_{a(\theta)}^{b(\theta)}h(y)\text{d}y\qquad a(\theta)\le x\le b(\theta)$$shows$$X_{(1)},X_{(n)}\sim \frac{n(n-1)}{2}h(x_{(1)})[F(x_{(n)})-F(x_{(1)})]^{n-2}h(x_{(n)})g(\theta)^{2}\Bbb I_{a(\theta)\le x_{(1)}\le x_{(n)}\le b(\theta)}$$with likelihood$$\frac{n(n-1)}{2}h(x_{(1)})\left(\int_{x_{(1)}}^{x_{(n)}}h(y)\text{d}y\right)^{n-2}h(x_{(n)})g(\theta)^n\Bbb I_{a(\theta)\le x_{(1)}\le x_{(n)}\le b(\theta)}\propto g(\theta)^{n}\Bbb I_{\theta\ge \max(a^{-1}(X_{(1)}), b^{-1}(X_{(n)}))}$$ and sufficient statistic $\max(a^{-1}(X_{(1)}), b^{-1}(X_{(n)}))$. – Xi'an Apr 23 '19 at 06:21
  • Thank you for your kind help! I am truly sorry to be ignorant, but to solve for $E(T)$ would I use the proposed density in this manner: $$E(T) = \iint \max \left(a^{-1}(x_{(1)}), b^{-1}(x_{(n)})\right) \frac{n(n-1)}{2} h\left(x_{(1)}\right)\left[F\left(x_{(n)}\right)-F\left(x_{(1)}\right)\right]^{n-2} h\left(x_{(n)}\right) g(\theta)^{2}\, \mathbb{I}_{a(\theta) \leq x_{(1)} \leq x_{(n)} \leq b(\theta)}\, dx_{(1)}\, dx_{(n)}$$ – user_1512314 Apr 23 '19 at 18:07

1 Answer


Possible way to proceed:

For any $0<t<\theta$, the distribution function of $T$ is

\begin{align} P_{\theta}(T\le t)&=P_{\theta}\left[a(t)\le X_{(1)},X_{(n)}\le b(t)\right] \\&=P_{\theta}\left[a(t)\le X_1,X_2,\ldots,X_n\le b(t)\right] \\&=\left[P_{\theta}(a(t)\le X_1\le b(t))\right]^n \\&=(g(\theta))^n \left(\int_{a(t)}^{b(t)}h(x)\,dx\right)^n \end{align}

Density of $T$ is therefore $$f_T(t)=n(g(\theta))^n \left(\int_{a(t)}^{b(t)}h(x)\,dx\right)^{n-1}\left[h(b(t))b'(t)-h(a(t))a'(t)\right]\mathbf1_{0<t<\theta}$$
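As a sanity check, this CDF formula can be tested by simulation on a concrete member of the family. The choice below is my own illustration, not part of the question: $a(\theta)=-\theta$, $b(\theta)=\theta$, $h(x)=1$, $g(\theta)=1/(2\theta)$, i.e. Uniform$(-\theta,\theta)$, for which $T=\max(-X_{(1)},X_{(n)})=\max_i|X_i|$ and the formula predicts $P_\theta(T\le t)=(t/\theta)^n$:

```python
import random

# Sanity check of P_theta(T <= t) = (g(theta))^n * (int_{a(t)}^{b(t)} h dx)^n
# for an assumed concrete family (not from the question):
# a(theta) = -theta, b(theta) = theta, h(x) = 1, g(theta) = 1/(2*theta),
# i.e. X ~ Uniform(-theta, theta).  Then T = max_i |X_i| and the formula
# predicts P(T <= t) = (t/theta)^n.

random.seed(0)
theta, n, reps = 2.0, 5, 200_000

def sample_T():
    xs = [random.uniform(-theta, theta) for _ in range(n)]
    return max(abs(x) for x in xs)   # max(a^{-1}(X_(1)), b^{-1}(X_(n)))

for t in (0.5, 1.0, 1.5):
    empirical = sum(sample_T() <= t for _ in range(reps)) / reps
    predicted = (t / theta) ** n     # (g(theta))^n * (b(t) - a(t))^n
    assert abs(empirical - predicted) < 0.01, (t, empirical, predicted)
print("CDF formula matches simulation")
```

The same simulation with the empirical density (histogram of $T$) also matches $f_T$ above for this family.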

Since you say you have verified $T$ is complete sufficient, any function of $T$ that is unbiased for $\theta$ would be the UMVUE of $\theta$.

So you might set up $E_{\theta}\left[ k(T)\right]=\theta\ldots (*)$ for some function $k$ of $T$ and solve for $k$. One way to do that is to differentiate both sides of $(*)$ with respect to $\theta$. I would leave the details to you.

Keep in mind that $$\int_{a(\theta)}^{b(\theta)} f(x;\theta)\,dx=1\implies \int_{a(\theta)}^{b(\theta)} h(x)\,dx=\frac{1}{g(\theta)}$$
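When differentiating $(*)$ with respect to $\theta$, the same identity also supplies the endpoint derivative of the inner integral, a small step the calculation needs (assuming $a$ and $b$ are differentiable):

$$\frac{d}{d\theta}\int_{a(\theta)}^{b(\theta)} h(x)\,dx \;=\; h(b(\theta))\,b'(\theta) - h(a(\theta))\,a'(\theta) \;=\; \frac{d}{d\theta}\,\frac{1}{g(\theta)} \;=\; -\,\frac{g'(\theta)}{g(\theta)^{2}}$$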

I don't see how to readily verify whether the proposed estimator $T-\frac{g(T)}{ng'(T)}$ is unbiased for $\theta$.
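One can at least sanity-check unbiasedness by simulation for a concrete member of the family. The family below (my own illustrative choice, not from the question) is Uniform$(-\theta,\theta)$: $a(\theta)=-\theta$, $b(\theta)=\theta$, $h\equiv 1$, $g(\theta)=1/(2\theta)$, so $g(t)/g'(t)=-t$ and the proposed estimator reduces to $T+T/n=(n+1)T/n$:

```python
import random

# Simulation check of E[T - g(T)/(n g'(T))] = theta for an assumed concrete
# family (not from the question): Uniform(-theta, theta), i.e.
# a(theta) = -theta, b(theta) = theta, h(x) = 1, g(theta) = 1/(2*theta).
# Here T = max_i |X_i|, g(t) = 1/(2t), g'(t) = -1/(2t^2).

random.seed(1)
theta, n, reps = 3.0, 4, 400_000

total = 0.0
for _ in range(reps):
    T = max(abs(random.uniform(-theta, theta)) for _ in range(n))
    g, gprime = 1.0 / (2 * T), -1.0 / (2 * T * T)
    total += T - g / (n * gprime)    # proposed estimator, = (n+1)T/n here

mean = total / reps
assert abs(mean - theta) < 0.02      # unbiased up to Monte Carlo error
print(f"E[theta_hat] ~ {mean:.4f}  (theta = {theta})")
```

This only checks one family, of course; it is not a substitute for the general argument via $(*)$.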

StubbornAtom