
I am confused about the terminology used when discussing the logistic GLM.

When dealing with any GLM, we have, for the response $Y_i$:

$$ E(Y_i) = \mu_i $$ and $$ g(\mu_i)= x_i^T \beta = \eta_i $$

where $x_i^T$ is the transpose of a vector of covariates, and $\beta$ is a vector of parameters to be estimated.

Now, in the logistic case, $Y_i$ is the number of successes in a sample of size $n_i$ with probability of success $\pi_i$; I understand that this implies $Y_i \sim Bin(n_i, \pi_i)$. We are then told that $$ logit(\pi_i) = x_i^T \beta $$ that is, the link function is $g(x) = logit(x)$. But this is confusing, since $\mu_i = n_i \pi_i$.

So are we modelling $E(Y_i)$ here, or are we modelling $E(Y_i)/n_i$? This is particularly annoying as I am trying to find:

$$ \frac{\partial \eta_i }{\partial \mu_i} $$

Edit: Maybe I should explain more about why I am confused. The text by Dobson states that the solution for the parameter vector $\beta$ using the Newton-Raphson algorithm is given by:

$$ X^T W X b^{(m)} = X^T W Z $$ where $b^{(m)}$ is the $m$-th iteration of the estimate of $\beta$. Now, $W$ is defined as a square diagonal matrix with:

$$ w_{ii} = \frac{1}{Var(Y_i)} \left( \frac{\partial \mu_i}{\partial \eta_i}\right)^2 $$
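(For reference, $Z$ here is the vector of ''working responses'' of the iterative algorithm; in the standard IRLS setup, which I believe matches Dobson's notation, its elements are

$$ z_i = x_i^T b^{(m-1)} + (y_i - \mu_i)\frac{\partial \eta_i}{\partial \mu_i}, $$

with $\mu_i$ and the derivative evaluated at the current estimate $b^{(m-1)}$.)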

Now, according to the wiki, we have that $$ Y_i \sim Bern(\pi_i) $$

This implies that $E(Y_i) = \mu_i = \pi_i$, and since $logit(\pi_i) = \eta_i$, we have that:

$$ \frac{\partial \mu_i}{\partial \eta_i} = \pi_i(1-\pi_i) $$
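(This follows by differentiating the inverse logit: $\mu_i = \pi_i = e^{\eta_i}/(1+e^{\eta_i})$, so $\partial \mu_i/\partial \eta_i = e^{\eta_i}/(1+e^{\eta_i})^2 = \pi_i(1-\pi_i)$.)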

We also have that $Var(Y_i) = \pi_i (1- \pi_i)$. So then, the diagonal elements of $W$ are given as:

$$ w_{ii} = \pi_i ( 1- \pi_i) $$ but in equation 21 of the following source, the author states that the diagonal elements of $W$ should in fact be:

$$ w_{ii} = n_i\pi_i ( 1- \pi_i) $$

But this contradicts $Y_i$ being Bernoulli, doesn't it?

  • $\pi_i=E[X_i]$ where $X_i$ is Bernoulli. – Alex R. May 27 '16 at 17:53
  • @AlexR. so $Y_i \sim Bern(\pi_i) $ and not binomial?.. – WeakLearner May 27 '16 at 23:55
  • The point of a binomial is to estimate the physical frequency of your events. After all, how else would you calculate $\pi_i$? You can't just calculate it from a single coin toss. – Alex R. May 28 '16 at 00:45
  • I.E. you're estimating the success probability, given covariates – Alex R. May 28 '16 at 00:46
  • @AlexR. could you be more specific? I'm asking what the distribution of $Y$ is, and what then is $\mu_i$ in the context of the GLM... – WeakLearner May 28 '16 at 06:12
  • maybe this will help: http://stats.stackexchange.com/questions/164120/interesting-logistic-regression-idea-problem-data-not-currently-in-0-1-form/164127#164127 –  May 28 '16 at 07:13
  • @fcop that does help, so then I am just confused about one thing, if $g(E(Y_i)) = \eta_i$, then are we saying that $logit(n_i \pi_i) = \eta_i$ ??? – WeakLearner May 28 '16 at 08:37
  • A comment is too short so I tried to explain it in an answer –  May 28 '16 at 09:32

1 Answer


Let $Y_i$ be a Binomial random variable with success probability $\pi_i$ and size $n_i$ (note that a Bernoulli random variable is just the special case where $n_i=1$).

It is well known that $E(Y_i)=n_i \pi_i$, or equivalently that $\pi_i = E(Y_i/n_i)$ (this is important in the context of your comment).

Let us further assume that $logit(\pi_i) = logit(E(Y_i/n_i)) = x_i^T \beta$.

Having observed $c_i$ successes in experiment $i$, the likelihood function is (up to a binomial coefficient that does not depend on $\beta$) $\mathcal{L}(\beta)=\prod_i \pi_i(\beta)^{c_i} (1-\pi_i(\beta))^{n_i - {c_i}}$.

Maximising this with respect to $\beta$ and computing the Fisher information matrix, you can find the results for two cases (a short sketch of the computation follows the list below):

  1. $\forall i, n_i=1$ (and then $c_i \in \{0,1\}$), which is the Bernoulli case
  2. $\exists i, n_i>1$ , which is the Binomial case
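As a quick sketch of that computation (standard GLM algebra, filled in here for concreteness rather than taken from a specific source): with log-likelihood $\ell(\beta)=\sum_i \left[ c_i\log\pi_i + (n_i-c_i)\log(1-\pi_i)\right]$ and $logit(\pi_i)=x_i^T\beta$, we get

$$ \frac{\partial \ell}{\partial \beta} = \sum_i (c_i - n_i\pi_i)\,x_i, \qquad \mathcal{I}(\beta) = E\left(-\frac{\partial^2 \ell}{\partial \beta\, \partial \beta^T}\right) = \sum_i n_i\pi_i(1-\pi_i)\,x_i x_i^T = X^T W X, $$

so the diagonal weights are $w_{ii}=n_i\pi_i(1-\pi_i)$ in general, and they reduce to $\pi_i(1-\pi_i)$ only in the Bernoulli case $n_i=1$.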

So, regarding the ''confusion'' in your comment: in the Binomial case you do not work with $E(Y_i)$ but with $E(Y_i/n_i)$ (see above), and thus also with $g(E(Y_i/n_i))$.
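If it helps to see this numerically, below is a small hand-rolled IRLS sketch (my own illustration, so the data and variable names are made up, not from Dobson or any other source): fitting the same data once as grouped Binomial counts and once as individual Bernoulli rows gives essentially the same $\hat\beta$, while the IRLS weights differ by exactly the factor $n_i$.

```python
# Minimal IRLS sketch for a logistic GLM (illustrative only; names are made up).
# Grouped case:   y_i ~ Bin(n_i, pi_i),  w_ii = n_i * pi_i * (1 - pi_i)
# Bernoulli case: n_i = 1 for every row, w_ii = pi_i * (1 - pi_i)
import numpy as np

def irls_logistic(X, y, n, n_iter=25):
    """Solve X'WX b = X'Wz iteratively, with logit(pi_i) = x_i' beta and mu_i = n_i * pi_i."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        pi = 1.0 / (1.0 + np.exp(-eta))
        w = n * pi * (1.0 - pi)              # diagonal of W: (dmu/deta)^2 / Var(Y_i)
        z = eta + (y - n * pi) / w           # working response
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
    return beta, w

rng = np.random.default_rng(0)

# Grouped data: 5 covariate patterns, n_i trials each (all values made up).
X_grouped = np.column_stack([np.ones(5), np.array([-2.0, -1.0, 0.0, 1.0, 2.0])])
n = np.array([10, 12, 8, 15, 11])
pi_true = 1.0 / (1.0 + np.exp(-(X_grouped @ np.array([0.3, 0.8]))))
y = rng.binomial(n, pi_true)
beta_grouped, w_grouped = irls_logistic(X_grouped, y, n)

# The same data expanded into individual Bernoulli trials (n_i = 1 per row).
X_bern = np.repeat(X_grouped, n, axis=0)
y_bern = np.concatenate([np.r_[np.ones(k), np.zeros(m - k)] for k, m in zip(y, n)])
beta_bern, w_bern = irls_logistic(X_bern, y_bern, np.ones(len(y_bern)))

print(beta_grouped, beta_bern)          # essentially identical estimates
print(w_grouped[0], n[0] * w_bern[0])   # grouped weight = n_i * Bernoulli weight
```

The point is only that the two parameterisations describe the same model: the Bernoulli weights $\pi_i(1-\pi_i)$ are counted $n_i$ times each, which is exactly where the factor $n_i$ in $w_{ii}$ comes from.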

  • This makes things clearer. So then it is not really correct to say that for any GLM we have that $g(\mu_i) = $ a linear function of the covariates? – WeakLearner May 28 '16 at 11:16