
Let $X_1,X_2,\ldots,X_n$ be i.i.d. random variables with pdf

$$f(x\mid\theta)=\begin{cases}\dfrac{2(\theta-x)}{\theta^2}, & \text{if }0<x<\theta,\\ 0, & \text{otherwise.}\end{cases}$$

What is the maximum likelihood estimator of $\theta$ ?

I know how to find the MLE for distributions such as the uniform and exponential, e.g. by maximising the log-likelihood, but I am unable to work out the MLE in this case.
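For context, here is how far I get. Assuming $\theta > \max_i x_i$ (otherwise the likelihood is zero), the log-likelihood is

$$\ell(\theta) = n\log 2 + \sum_{i=1}^n \log(\theta - x_i) - 2n\log\theta,$$

and setting $\ell'(\theta) = 0$ gives

$$\sum_{i=1}^n \frac{1}{\theta - x_i} = \frac{2n}{\theta},$$

which does not seem to have a closed-form solution.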

  • The answers in these threads might help: https://stats.stackexchange.com/questions/371207/maximum-likelihood-estimate-for-a-likelihood-defined-by-parts, https://stats.stackexchange.com/questions/64102/mle-for-triangle-distribution/64103 – StubbornAtom Mar 22 '19 at 19:53
  • @StubbornAtom I am not getting a closed form for the MLE. – Jor_El Mar 23 '19 at 01:07
  • The likelihood function $L(\theta; x_1, x_2, \ldots, x_n)$ is given by $$L(\theta) = \begin{cases}0, & \theta \leq x_\max,\\ \prod_{i=1}^n \frac{2(\theta-x_i)}{\theta^2}, & \theta > x_\max,\end{cases}$$ where $x_\max = \max_i x_i$, and thus jumps from $0$ to a nonzero value at $x_\max$. I suspect that $L(\theta)$ is a decreasing function of $\theta$ for $\theta \in (x_\max, \infty)$ so that the MLE will be $x_\max$; at least, that's the way it happens for _discrete_ random variables whose pmfs are monotone decreasing functions on $(0,\theta)$. – Dilip Sarwate Mar 23 '19 at 13:52
  • @Dilip Sarwate How can it be shown analytically that the likelihood is decreasing in $\theta $? – Jor_El Mar 23 '19 at 18:01
  • https://stats.stackexchange.com/q/345874/119261 – StubbornAtom Mar 19 '20 at 14:19
  • MLE is not the maximum. Look at a particular case: https://stats.stackexchange.com/q/317874/119261. – StubbornAtom May 07 '20 at 20:03
  • Thanks @StubbornAtom – Jor_El May 07 '20 at 20:09
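A minimal numerical sketch (not from the thread; the simulation parameters and grid bounds are my own choices) illustrating the point in the comments that the MLE is *not* $x_\max$: simulate from the density via inverse-CDF sampling, then maximise the log-likelihood on a grid of $\theta$ values above $x_\max$ and observe that the peak lies strictly in the interior.

```python
import numpy as np

# f(x|theta) = 2(theta - x)/theta^2 on (0, theta), so
# F(x) = 1 - (1 - x/theta)^2 and F^{-1}(u) = theta * (1 - sqrt(1 - u)).
rng = np.random.default_rng(0)
theta_true, n = 1.0, 200
x = theta_true * (1.0 - np.sqrt(1.0 - rng.uniform(size=n)))

x_max = x.max()
# Candidate values theta > x_max (likelihood is zero at or below x_max).
grid = np.linspace(x_max * 1.0001, 2.0 * x_max, 5000)

# log L(theta) = n log 2 + sum_i log(theta - x_i) - 2n log(theta);
# the constant n log 2 is omitted since it does not affect the argmax.
loglik = np.log(grid[:, None] - x[None, :]).sum(axis=1) - 2 * n * np.log(grid)
idx = np.argmax(loglik)
theta_hat = grid[idx]

print(idx > 0)  # True: the peak is interior, not at the boundary near x_max
print(theta_hat, x_max)
```

The likelihood tends to $0$ as $\theta \downarrow x_\max$ (the factor $\theta - x_\max$ vanishes), rises to an interior maximum, and then decays, so the MLE must be found numerically from the score equation rather than taken to be $x_\max$.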

0 Answers