Let the density function be given by

$$ f(x;a,b) = \frac{a + 2 b g(x) + (1-a-b) g(x)^2}{(1-x)(2 b g(x) + (1-a-b) g(x)^2)}$$

where $a$ and $b$ are parameters of interest and $g(x)$ is a known function.

I was told that using this density function in maximum likelihood, the parameters $a$ and $b$ are identified.

The concept of identification is clear to me, but what is the rigorous mathematical argument for concluding identifiability here?

Lucas Farias
bonifaz
    You should also check this question for a way more detailed answer: https://stats.stackexchange.com/questions/20608/what-is-model-identifiability?rq=1 – Lucas Farias May 25 '17 at 00:48

2 Answers


In Chapter 14 [p. 2] of Greene's book, it is stated that for a likelihood function:

The parameter vector $\theta$ is identified (estimable) if for any other parameter vector, $\theta^* \neq \theta$, for some data $y$, $L(\theta^∗|y)\neq L(\theta |y)$.

Based on that, it is not hard to check that different values of $a$ and $b$ produce different densities $f$, and hence different values of the likelihood.

EDIT: To be precise, rather than "different values of $f$ for different values of $a$ and $b$", the requirement is that the mapping from the parameter vector $(a,b)$ to the density $f(\cdot\,;a,b)$ is injective (one-to-one).

Lucas Farias
  • Starting with $f(x; a, b) = f(x; a', b')$, I obtain $a'(-2b + g(x)(b-1)) + a(2b' + g(x)(1-b')) = 0$, and then comparing coefficients implies that $a = a'$ and $b = b'$. Is that what you had in mind? – bonifaz May 25 '17 at 18:48
  • @bonifaz Exactly. – Lucas Farias May 25 '17 at 19:04
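The comparing-coefficients argument in the comments can be sketched symbolically. The snippet below is my own illustration (not from the answer), using sympy and treating $g = g(x)$ as a free symbol: equality of the two densities for all $x$ forces a polynomial identity in $g$, whose coefficients pin down the parameters.

```python
# Symbolic sketch of the comparing-coefficients argument, treating
# g = g(x) as a free symbol; a2, b2 play the role of a', b'.
import sympy as sp

a, b, a2, b2, g = sp.symbols("a b a2 b2 g")

def num(a_, b_):
    # Numerator of f; the factor 1/(1 - x) is common to both sides and cancels.
    return a_ + 2*b_*g + (1 - a_ - b_)*g**2

def den(a_, b_):
    # Denominator of f without the (1 - x) factor.
    return 2*b_*g + (1 - a_ - b_)*g**2

# f(x; a, b) = f(x; a2, b2) for all x  <=>  num*den' - num'*den = 0 identically in g.
expr = sp.expand(num(a, b)*den(a2, b2) - num(a2, b2)*den(a, b))
p = sp.Poly(expr, g)

c1 = p.coeff_monomial(g)     # = 2*(a*b2 - a2*b)
c2 = p.coeff_monomial(g**2)  # = a - a2 - a*b2 + a2*b  (the g^3 and g^4 terms cancel)

# Combining the two coefficient conditions eliminates the cross terms:
print(sp.expand(c2 + c1/2))  # -> a - a2, so a = a2
# With a = a2, the g-coefficient condition c1 = 0 reduces to a*(b2 - b) = 0,
# so b = b2 as long as a != 0. (At a = 0 the density collapses to 1/(1-x)
# and b drops out, so identification needs a != 0.)
print(sp.expand(c1.subs(a2, a)))  # vanishes iff a*(b2 - b) = 0
```

One detail this makes visible: the coefficient equations only force $b = b'$ when $a \neq 0$, since at $a = 0$ the density no longer depends on $b$.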

I think a reasonable approach to a rigorous proof would be to prove the contrapositive -- that is, assume there exist some $a'$ and $b'$ such that $$ f(x;a',b') = f(x;a,b)$$ for all $x$, and demonstrate that this assumption forces $a=a'$ and $b=b'$.
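This strategy can also be spot-checked numerically. The sketch below is mine and assumes $g(x) = x$ purely for illustration (the actual $g$ is problem-specific): distinct parameter pairs give distinct densities, except in the degenerate case $a = 0$, where the density no longer depends on $b$.

```python
# Numeric spot-check of the injectivity argument, assuming g(x) = x
# for illustration only (the actual g is a known, problem-specific function).
def f(x, a, b, g=lambda t: t):
    gx = g(x)
    num = a + 2*b*gx + (1 - a - b)*gx**2
    den = (1 - x)*(2*b*gx + (1 - a - b)*gx**2)
    return num / den

xs = [0.1, 0.3, 0.5, 0.7]

# Distinct parameters (with a != 0) give distinct density values somewhere.
fa = [f(x, 0.3, 0.2) for x in xs]
fb = [f(x, 0.4, 0.1) for x in xs]
assert any(abs(u - v) > 1e-12 for u, v in zip(fa, fb))

# Degenerate case: with a = 0 the density collapses to 1/(1 - x),
# so b is not identified there.
f0 = [f(x, 0.0, 0.2) for x in xs]
f1 = [f(x, 0.0, 0.7) for x in xs]
assert all(abs(u - v) < 1e-12 for u, v in zip(f0, f1))
```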

dlid