
This may seem silly, but I wanted to confirm the derivative of the log-likelihood $l(x_i)$: $$\frac{d}{dx}\sum_{i=1}^{M} \log(x_i) = \frac{1}{x_i} \sum_{i=1}^{M} \frac{1}{x_i}$$ Is this correct?

UPDATE of the question: the pdf is $$p(x) = \frac{dP(x)}{dx} = d\,x^{d-1},$$ where $P(x) = x^d$ and $0 < x < 1$.

Then the log-likelihood of $x$ is $$\ln L(x) = \ln[p(x_1)\,p(x_2)\cdots p(x_M)] = M\ln(d) + (d-1)\sum_{i=1}^{M} \ln(x_i).$$

I then take the derivative of $\ln L(x)$ with respect to $x$ (or $x_i$) so as to find an estimate of $x$. The purpose is to find the minimizing $x$'s, and the minimum value found would correspond to the parameter. How do I proceed?

Drew75
SKM
    shouldn't it be the sum of $1/x_i$? – Drew75 Dec 03 '13 at 20:36
  • yes, it should. The best way to see it is $\frac{d}{dx}\sum_i \log x_i=\sum_i \frac{d}{dx}\log x_i=\sum_i\frac{1}{x_i}$. – fabee Dec 03 '13 at 20:41
  • @Drew75: Sorry for the incomplete question, I have updated it. Please have a look. – SKM Dec 03 '13 at 20:43
  • @Drew75 is right. Your derivative does not make sense because the $x_i$ is in front of the sum. But if you compute a log-likelihood, shouldn't it be $\frac{d}{dx}\sum_i \log p(x_i)$? – fabee Dec 03 '13 at 21:33
  • The derivative is **zero,** because the right hand side does not include any $x$, which is the variable with respect to which you are differentiating. If the $x_i$ are assumed to be functions of $x$, then this derivative is incorrect because it does not account for the $d(x_i)/dx.$ – whuber Dec 03 '13 at 21:36
  • from what likelihood does this come? – Elvis Dec 03 '13 at 21:58
  • @Elvis: I have updated the question, kindly have a look. – SKM Dec 03 '13 at 22:25
  • @whuber: I have posted the full problem with the likelihood function. Are the steps all right? – SKM Dec 03 '13 at 22:26
  • The edit appears to confuse "$x$" with the parameter "$d$". As such it is nonsensical. Take a look at related questions on our site, such as http://stats.stackexchange.com/questions/32103, which illustrate the procedure generally, or http://stats.stackexchange.com/questions/4052, which examines a specific probability model. – whuber Dec 03 '13 at 22:26

1 Answer

If I understand correctly, the density is $$ f(x) = \begin{cases} d x^{d-1} & \text{if } 0 \le x \le 1 \\ 0 & \text{else} \end{cases}$$

The likelihood of $d$ is a function of the parameter $d$, the observations $x_1, \dots, x_n$ being considered as fixed: $$L(d) = \prod_{i=1}^n d x_i^{d-1} = d^n\prod_{i=1}^nx_i^{d-1} .$$ The log-likelihood is $\ell(d) = n \log d + (d-1) \sum_{i=1}^n \log(x_i)$.
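As a quick numerical sanity check (a Python sketch, not essential to the answer): the log-likelihood formula $\ell(d) = n \log d + (d-1)\sum_i \log(x_i)$ should agree with taking the log of the product likelihood directly.

```python
import math
import random

# Sanity check: l(d) = n*log(d) + (d-1)*sum(log(x_i)) should equal
# log(prod f(x_i)) for the density f(x) = d * x**(d-1) on (0, 1).
random.seed(1)
d = 2.5
xs = [random.random() for _ in range(10)]  # any points in (0, 1) will do

log_lik = len(xs) * math.log(d) + (d - 1) * sum(math.log(x) for x in xs)
direct = math.log(math.prod(d * x ** (d - 1) for x in xs))
print(abs(log_lik - direct) < 1e-9)  # the two agree to numerical precision
```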

It makes no sense to try to differentiate $\ell(d)$ with respect to the $x_i$. To find the MLE you have to differentiate it with respect to $d$: $$ {\partial \over \partial d} \ell(d) = {n\over d} + \sum_{i=1}^n \log(x_i),$$ and setting this to zero leads to $$ \widehat{d} = - {n} \left( \sum_{i=1}^n \log(x_i) \right)^{-1}.$$
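You can verify the closed-form estimator above by simulation (a Python sketch, using inverse-CDF sampling since $P(x) = x^d$ inverts to $x = u^{1/d}$):

```python
import math
import random

# Simulate from the density f(x) = d * x**(d-1) on (0, 1) via the
# inverse CDF: P(x) = x**d, so x = u**(1/d) for u ~ Uniform(0, 1).
random.seed(0)
d_true = 3.0
n = 100_000
xs = [random.random() ** (1.0 / d_true) for _ in range(n)]

# Closed-form MLE: d_hat = -n / sum(log x_i)
d_hat = -n / sum(math.log(x) for x in xs)
print(d_hat)  # close to d_true = 3.0
```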

Elvis
  • Thank you for your reply. Could you please let me know if the above can be extended to this application: considering the $x_i$ to be the values of distances between 2 different signals (a received signal $X$ with actual parameters $a, b$, and $Y$ the output of an inverse filter). The distances are $r_i = \|X_i - Y_i\|$. Instead of using the raw distances, using the pdf of the distances from above for both signals, how can I apply MLE on $D = \int (f(x_n|a,b) - f(y_n|a_1,b_1))^2 \, dy$, so as to maximize the probability of finding $D$? – SKM Dec 04 '13 at 18:53
  • I am sorry, I don’t know anything about signal filtering and stuff. You should open a new question, describing in detail your data, your problem, and the rationale of your solution. – Elvis Dec 04 '13 at 20:04
  • Thank you, but can you say whether, from the pdf above, I can get an optimal value of $x$ by applying MLE or Expectation Maximization? – SKM Dec 05 '13 at 01:52
  • It seems like the estimator for $d$ is inverted in the final line. The estimator is actually for $1/d$. – Drew75 Dec 05 '13 at 07:51