
Let $X$ be a random variable with density function $f(x)=\theta x^{\theta -1}\mathbb I_{(0,1)}(x)$, where $\theta>0$ is unknown. I would like to compute the maximum likelihood estimator of $\theta$.

My idea is the following. I write the likelihood function: $$G(x_1, \cdots, x_n)=\theta^n\prod_{i=1}^nx_i^{\theta-1}\, \mathbb I_{(0,1)}(x_i). $$ My problem is how to deal with the indicator function. Without it, I would take $\log G$ and compute its derivative to see where it equals $0$. Doing this I find $$\hat \theta=-\frac{n}{\sum_{i=1}^n\log x_i}.$$
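
Explicitly, ignoring the indicator for the moment (so assuming every $x_i \in (0,1)$), the computation I have in mind is \begin{align*} \log G(\theta) &= n\log\theta + (\theta-1)\sum_{i=1}^n \log x_i, \\ \frac{d}{d\theta}\log G(\theta) &= \frac{n}{\theta} + \sum_{i=1}^n \log x_i = 0 \quad\Longrightarrow\quad \hat\theta = -\frac{n}{\sum_{i=1}^n \log x_i}, \end{align*} which is positive because $\log x_i < 0$ for $x_i \in (0,1)$.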

Is this correct? How can I deal with the indicator function?

Edit: The maximum likelihood estimator I found, namely $\hat \theta=-n/\sum_{i=1}^n\log x_i$, does not seem to be a sufficient statistic for $\theta$. Could someone tell me how to find a sufficient statistic for $\theta$?

Thank you

user268193

1 Answer

Recall $$ \Bbb{I}_{(0,1)}(x_i) = \begin{cases} 1 ,& x_i \in (0,1) \\ 0 ,& \text{otherwise} \end{cases} \text{.} $$

Then \begin{align*} \prod_{i=1}^n x_i^{\theta-1} \Bbb{I}_{(0,1)}(x_i) &= \prod_{i=1}^n x_i^{\theta-1} \prod_{i=1}^n \Bbb{I}_{(0,1)}(x_i) \\ &= \begin{cases} \prod_{i=1}^n x_i^{\theta-1} ,& \text{ all the $x_i \in (0,1)$} \\ 0,& \text{otherwise.} \end{cases} \end{align*}

The upshot is that the indicator only restricts the support: you get your $\hat{\theta}$ conditional on all the $x_i \in (0,1)$, and the likelihood is identically $0$ otherwise.
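
Equivalently, in terms of the log-likelihood (a sketch, using the convention $\log 0 = -\infty$): $$\log G(\theta) = n\log\theta + (\theta-1)\sum_{i=1}^n\log x_i + \sum_{i=1}^n \log \Bbb{I}_{(0,1)}(x_i),$$ where the last sum is $0$ when every $x_i \in (0,1)$ and $-\infty$ otherwise. That term does not depend on $\theta$, so it vanishes when you differentiate, and the stationarity condition for $\hat\theta$ is exactly the one you solved.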

Eric Towers
  • Thank you very much! I was confused; now it's clear. I edited the question to add another part. It would be great if you could also help me with that second part. Thank you again! – user268193 Aug 31 '20 at 21:53
  • @user268193 : Why do you say your $\hat{\theta}$ is not sufficient? It has no dependence on $\theta$ (which is the only condition I recall for detecting a sufficient statistic). – Eric Towers Aug 31 '20 at 22:02
  • Thank you for the answer. As far as I know, a statistic $\hat \theta$ is sufficient if there exist two functions $F$ and $H$ such that the likelihood function factors as $G(x_1, \cdots, x_n, \theta)=F(\hat\theta, \theta)H(x_1, \cdots,x_n)$. In this case I don't see how to construct the functions $F$ and $H$. – user268193 Aug 31 '20 at 22:09
  • @user268193 : Relative to the Fisher-Neyman factorization, $G(\vec{x}) = h(\vec{x})g(\theta,T(\vec{x}))$, the product of the indicator functions lands in the $h(\vec{x})$ and the rest lands in $g(\theta, T(\vec{x}))$. You've solved stationarity of $g$ for $\theta$ in terms of $T(x)$. It appears to me you have a sufficient statistic. (Maybe it's not minimally sufficient?) – Eric Towers Aug 31 '20 at 22:11
  • Doesn't the product of the $x_i^{\theta-1}$ also land in $h$? That is, shouldn't it be $h(x)=\prod_i x_i^{\theta-1}\mathbb I_{(0,1)}(x_i)$ and $g(\theta, T(x))=\theta^n$? – user268193 Aug 31 '20 at 22:14
  • No, I am wrong. I cannot write $h(x)$ in that way, since it depends on $\theta$. So maybe I have to use the exponential of the log to write $g$. – user268193 Aug 31 '20 at 22:16
  • @user268193 : Since you used the sum of the $\log x_i$ in your statistic, that sum has to come from $g$. This means the product of the $x_i^{\theta-1}$ lands in $g$. (Nothing from $h$ is used when you solve for stationarity to get $\hat{\theta}$.) – Eric Towers Aug 31 '20 at 22:17
  • @user268193 : So the product of the indicators lands in $h$, the power of $\theta$ lands in $g$, and the product of the $x_i^{\theta-1}$ lands in $g$; the full factorization is written out below these comments. I'm still convinced you have a sufficient statistic. – Eric Towers Aug 31 '20 at 22:18
  • Yes, you are right. I was applying the Fisher–Neyman factorization incorrectly. Now I understand. Thank you. – user268193 Aug 31 '20 at 22:20
  • @user268193 : Glad to hear I could help! – Eric Towers Aug 31 '20 at 22:22
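
For completeness, here is the factorization discussed in the comments written out explicitly. When every $x_i \in (0,1)$ (otherwise the indicator product makes the likelihood $0$), $$ G(x_1,\dots,x_n;\theta) = \underbrace{\prod_{i=1}^n \Bbb{I}_{(0,1)}(x_i)}_{h(\vec{x})} \cdot \underbrace{\theta^n \exp\!\Big((\theta-1)\, T(\vec{x})\Big)}_{g(\theta,\, T(\vec{x}))}, \qquad T(\vec{x}) = \sum_{i=1}^n \log x_i, $$ so by the Fisher–Neyman factorization theorem $T(\vec{x})=\sum_{i=1}^n \log x_i$ (equivalently $\hat\theta = -n/T(\vec{x})$) is a sufficient statistic for $\theta$.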