
For instance, suppose we observed $n$ tosses of a biased coin with probability of heads $\theta$. How can I estimate this parameter via maximum likelihood? How can I derive the log-likelihood formula and the correct maximum likelihood estimate of $\theta$?

vessilli
    Given $\theta$, what is the probability of seeing $x$ out of $n$ heads? So what is the likelihood function for $\theta$ given $x$ out of $n$ heads? How would you find a $\theta$ which maximises this likelihood? – Henry May 11 '14 at 17:23
  • I have tried to find a solution by myself, but I do not know if it is correct or how to derive the log-likelihood. – vessilli May 11 '14 at 17:23
  • Give me a moment, I am going to share it here. – vessilli May 11 '14 at 17:24
  • 2
    You might find [this](http://stats.stackexchange.com/questions/45043/maximum-likelihood-estimation?rq=1) answer to a similar question useful. – QuantIbex May 11 '14 at 17:29
  • 1
    We know that maximum likelihood is the value of the θ that is most well supported by the data. Let x be the number of heads and n be the number of tails, we get: L(θ,x) = θ^{x} (1-θ)^(n-x) By applying derivative, we get: d/dθ(l(θ,x)) = x/θ - (n-x)/(1-θ), And we get: (1-x/n)θ = (1-θ)x/n ; θ = x/n – vessilli May 11 '14 at 17:42
  • The best would be to add your development to your question. Mathematic expressions can be formatted using [LaTeX](http://stats.stackexchange.com/editing-help#latex). – QuantIbex May 11 '14 at 17:54
  • I guess you meant $n-x$ tails. – QuantIbex May 11 '14 at 18:11
  • I am editing it using LaTeX format. We know that maximum likelihood is the value of the $\theta$ that is most well supported by the data. Let $x$ be the number of heads and $n$ be the number of tails, we get: $$L\left ( \theta ,x \right ) = \theta^x(1-\theta)^{n-x} $$ By applying derivative, we get: $$ \frac{\partial}{\partial\theta}L\left ( \theta,x \right ) = \frac{x}{\theta}-\frac{n-x}{1-\theta} $$ And, $$ \left ( 1-\frac{x}{n} \right )\theta = \left ( 1-\theta \right )\frac{x}{n} $$ So, $$ \theta = \frac{x}{n} $$ – vessilli May 11 '14 at 18:25
  • Again, the number of tails should be $n-x$, not $n$. Again, show your development in the question, not in comments. Also, the derivative you write is that of the *log*-likelihood $l(\theta,x)$, not the likelihood $L(\theta, x)$ – QuantIbex May 11 '14 at 20:13
  • Yes. I could discover it when the time for editing is over. – vessilli May 11 '14 at 21:01

1 Answer


I've read your comment, where you write $L(\theta;x)=\theta^x(1-\theta)^{n-x}$. This is right: up to the binomial coefficient, it is the likelihood of observing $x$ heads in $n$ tosses.

Let's say that head = 1 and tail = 0. If $\mathbf{x}_n$ is a set of $n$ tosses, the number of heads is $\sum_{i=1}^n x_i=n\bar{x}$. The likelihood function is: $$L(\theta;\mathbf{x}_n)\propto \theta^{\sum_ix_i}(1-\theta)^{n-\sum_ix_i}=\theta^{n\bar{x}}(1-\theta)^{n-n\bar{x}}$$ The log-likelihood is: $$\ell(\theta;\mathbf{x}_n)=n\bar{x}\ln\theta+(n-n\bar{x})\ln(1-\theta)$$ and: $$\ell'(\theta;\mathbf{x}_n)=\frac{n\bar{x}}{\theta}-\frac{n-n\bar{x}}{1-\theta}=\frac{n\bar{x}-n\theta}{\theta(1-\theta)}=0\quad\Rightarrow\quad\hat\theta=\bar{x}$$ Since $\ell''(\theta;\mathbf{x}_n)=-\frac{n\bar{x}}{\theta^2}-\frac{n-n\bar{x}}{(1-\theta)^2}<0$ on $(0,1)$, this critical point is indeed a maximum.
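You can check the closed form numerically. Here is a minimal sketch (the simulated data, seed, and sample size are my own assumptions, not from the question): it evaluates $\ell(\theta)$ on a grid and confirms that the maximizer agrees with $\bar{x}$.

```python
import numpy as np

# Hypothetical data: simulate n Bernoulli tosses (1 = head, 0 = tail).
# The true theta and n below are illustrative choices, not from the question.
rng = np.random.default_rng(0)
n, theta_true = 100, 0.6
tosses = rng.binomial(1, theta_true, size=n)

def log_likelihood(theta, x):
    """l(theta; x) = sum(x) * ln(theta) + (n - sum(x)) * ln(1 - theta)."""
    s = x.sum()
    return s * np.log(theta) + (len(x) - s) * np.log(1 - theta)

# Evaluate l(theta) on a fine grid and compare the numerical argmax
# with the closed-form MLE, which is the sample mean x-bar.
grid = np.linspace(0.001, 0.999, 999)
theta_hat_grid = grid[np.argmax(log_likelihood(grid, tosses))]
theta_hat_closed = tosses.mean()

print(theta_hat_grid, theta_hat_closed)
```

Because $\ell$ is smooth and concave on $(0,1)$, the grid maximizer lands within one grid step (here $0.001$) of $\bar{x}$.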

Sergio