
Suppose $Y_i \overset{\text{iid}}{\sim}\text{Poisson}(X_i \lambda)$, where $X_i$ are known covariates. Give a condition on $\{X_i\}$ such that $\hat{\lambda}_{\text{MLE}}$ is a consistent estimator of $\lambda$, and give a counterexample of $\{X_i\}$ for which $\hat{\lambda}_{\text{MLE}}$ is not a consistent estimator of $\lambda$ when that condition is violated.

I have already shown that $$\hat{\lambda}_{\text{MLE}} = \dfrac{\sum_{i=1}^{n}Y_i}{\sum_{i=1}^{n}X_i}$$ and obviously $\sum_{i=1}^{n}Y_i \sim \text{Poisson}\left(\lambda\sum_{i=1}^{n}X_i\right)$.
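For completeness, here's a sketch of how that expression falls out of the log-likelihood for this model: $$\ell(\lambda) = \sum_{i=1}^{n}\left[Y_i \log(X_i\lambda) - X_i\lambda - \log(Y_i!)\right], \qquad \ell'(\lambda) = \frac{\sum_{i=1}^{n}Y_i}{\lambda} - \sum_{i=1}^{n}X_i,$$ and setting $\ell'(\lambda) = 0$ yields the expression above.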

In general, I know that the MLE is a consistent estimator, but I'm not sure how to go about showing that in this particular situation. I initially considered dividing both the numerator and denominator by $n$ to get $$\hat{\lambda}_{\text{MLE}} = \dfrac{\bar{Y}_n}{\bar{X}_n},$$ and we know $\bar{Y}_n \overset{p}{\to} \lambda\sum_{i=1}^{\infty}X_i$ (is this right?) by the Weak Law of Large Numbers (WLLN), so I would guess that if $0 < \sum_{i=1}^{\infty}X_i < \infty$, then $$\hat{\lambda}_{\text{MLE}}\overset{p}{\to}\dfrac{\lambda\sum_{i=1}^{\infty}X_i}{\sum_{i=1}^{\infty}X_i} = \lambda\text{.}$$ If $\sum_{i=1}^{\infty}X_i = \infty$, then I would guess the condition would be violated, though I don't know how to show this.

Is my work correct? If not, how can I fix it?

Edit: I've just realized my work is definitely wrong, as $\bar{X}_n \overset{p}{\to} \sum_{i=1}^{\infty}X_i$ is obviously not true.

Clarinetist
  • This is surely overkill, but one way would be to calculate $E[\hat{\lambda}]$ and $\text{Var}(\hat{\lambda})$ and give an argument that a standardized $\hat{\lambda}$ converges to a standard normal in the limit, from which the required result follows easily. – Glen_b May 08 '17 at 02:14

1 Answer


You need to use the fact that $$ \sum_{i=1}^{n}Y_i \sim \text{Poisson}(\lambda\sum_{i=1}^{n}X_i). $$

The estimator is clearly unbiased: \begin{align*} E[\hat{\lambda}] &= \frac{\sum_i \lambda x_i}{\sum_i x_i}\\ &= \lambda. \end{align*} And the variance \begin{align*} \text{Var}[\hat{\lambda}] &= \frac{\text{Var}\left(\sum_i Y_i\right)}{\left[\sum_i x_i \right]^2} \\ &= \frac{\lambda \sum_i x_i}{\left[\sum_i x_i\right]^2 } \\ &= \frac{\lambda }{\sum_i x_i } \end{align*} vanishes as $n \to \infty$, provided all your covariates are positive (they should be anyway) and $\sum_{i=1}^{n} x_i \to \infty$, i.e. they don't shrink too quickly. Consistency then follows from Chebyshev's inequality.
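To spell out the Chebyshev step (using the unbiasedness and the variance above): for any $\varepsilon > 0$, $$ P\left(\left|\hat{\lambda} - \lambda\right| \geq \varepsilon\right) \leq \frac{\text{Var}[\hat{\lambda}]}{\varepsilon^2} = \frac{\lambda}{\varepsilon^2 \sum_{i=1}^{n} x_i} \to 0 \qquad \text{whenever } \sum_{i=1}^{n} x_i \to \infty, $$ so $\hat{\lambda} \overset{p}{\to} \lambda$ under that condition.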

Note that the $Y_i$ here are independent but not identically distributed, and the Poisson distribution is not a scale family (so you can't rescale the $Y_i$ to make them iid), which is why you can't just invoke the standard asymptotic theorems for iid rvs.
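As a quick numerical illustration (a minimal simulation sketch, not a rigorous argument — it assumes a constant design $x_i = 1$ for the divergent case and the summable design $x_i = i^{-2}$ for the other, and the exact numbers will vary with the seed):

```python
import numpy as np

rng = np.random.default_rng(0)

def lambda_mle(x, lam, rng):
    """Simulate Y_i ~ Poisson(x_i * lam) and return the MLE sum(Y) / sum(x)."""
    y = rng.poisson(x * lam)
    return y.sum() / x.sum()

lam = 2.0
n = 100_000

# Divergent design: sum(x_i) -> infinity, so Var(lambda_hat) -> 0 and the MLE
# concentrates at lam.
x_divergent = np.ones(n)

# Summable design: sum(x_i) stays bounded (close to pi^2 / 6), so Var(lambda_hat)
# does not vanish and the MLE keeps fluctuating around lam.
x_summable = 1.0 / np.arange(1, n + 1) ** 2

print([round(lambda_mle(x_divergent, lam, rng), 3) for _ in range(5)])  # tight around 2
print([round(lambda_mle(x_summable, lam, rng), 3) for _ in range(5)])   # visibly spread out
```

With the constant design the replicates cluster tightly around $\lambda = 2$, while with the summable design they stay noticeably spread out, matching the variance calculation above.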

Taylor
  • Perhaps I'm being nitpicky, but could you please clarify something? I started reviewing this: essentially, what you have here is a **sufficient** condition for consistency of $\hat{\lambda}_{\text{MLE}}$. A **sufficient** condition for $\hat{\lambda}_{\text{MLE}}$ to be consistent for $\lambda$ is that $\sum_{i=1}^{\infty}x_i = \infty$, which I understand. But this isn't a **necessary** condition for consistency. It is true that if $\hat{\lambda}_{\text{MLE}}$ were NOT consistent, then $\sum_{i=1}^{\infty}x_i < \infty$, but lack of consistency is exactly what we're trying to show. – Clarinetist May 08 '17 at 12:14
  • What would be a condition that would imply that $\hat{\lambda}_{\text{MLE}}$ is not consistent? – Clarinetist May 08 '17 at 12:15
  • Take $x_i = i^{-2}$ then – Taylor May 08 '17 at 15:13
  • If we take $x_i = i^{-2}$, then $\sum_{i=1}^{\infty}x_i$ is a convergent $p$-series, so $\sum_{i=1}^{\infty}x_i < \infty$ and the asymptotic variance is nonzero. What I don't understand is how this implies that $\hat{\lambda}_{\text{MLE}}$ isn't consistent. We have an unbiased estimator with nonzero asymptotic variance, but we see from [this answer](https://stats.stackexchange.com/a/121572/46427) that there are examples of unbiased, consistent estimators with nonzero asymptotic variance. – Clarinetist May 08 '17 at 15:18
  • @Clarinetist taking the variance and showing it goes to 0 is a common way to show consistency using Chebyshev's. It's the most common because it only involves taking the variance, but it's only a sufficient condition, so you could always show consistency directly in case the variance doesn't go to 0. As you know, the failure of the antecedent does not imply the failure of the consequent. My counterexample in the comments isn't complete in the sense that I never gave you the epsilon-delta proof that it isn't consistent. I leave that to you (this question does have the self-study tag). – Taylor May 08 '17 at 15:38