12

I thought that the canonical link function $g(\cdot)$ comes from the natural parameter of the exponential family. Say, consider the family $$ f(y,\theta,\psi)=\exp\left\{\frac{y\theta-b(\theta)}{a(\psi)}-c(y,\psi)\right\} $$ Then $\theta=\theta(\mu)$ is the canonical link function. Take the Bernoulli distribution as an example: we have $$ P(Y=y)=\mu^{y}(1-\mu)^{1-y}=\exp\left\{y\log\frac{\mu}{1-\mu}+\log{(1-\mu)}\right\} $$ so the canonical link function is $$g(\mu)=\log\frac{\mu}{1-\mu}.$$
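As a quick sanity check of this derivation (not part of the original question), one can verify symbolically that the logit inverts the mean map $\mu = b'(\theta)$; here the Bernoulli cumulant function $b(\theta)=\log(1+e^\theta)$ is assumed:

```python
import sympy as sp

# Sketch: for the Bernoulli family, b(theta) = log(1 + e^theta),
# so the mean is mu = b'(theta), and the canonical link should invert it.
theta, mu = sp.symbols('theta mu', positive=True)

b = sp.log(1 + sp.exp(theta))     # cumulant function b(theta)
mean = sp.diff(b, theta)          # b'(theta) = e^theta / (1 + e^theta)

g = sp.log(mu / (1 - mu))         # candidate canonical link (logit)
# Substituting mu = b'(theta) into g should recover theta.
recovered = sp.simplify(g.subs(mu, mean))
print(recovered)
```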

But this slide claims that $$ g'(\mu)=\frac{1}{V(\mu)} $$ Although this is easily verified for this particular distribution (and for some others, such as the Poisson), I can't see why the equivalence holds in general. Can anyone give a hint? Thank you~

ziyuang

1 Answer

14

The variance function for the Bernoulli variable is $V(\mu) = \mu(1-\mu)$. With the canonical link $g(\mu) = \log \frac{\mu}{1-\mu} = \log \mu - \log(1-\mu)$, we easily check that $$g'(\mu) = \frac{1}{\mu} + \frac{1}{1-\mu} = \frac{1 - \mu + \mu}{\mu(1-\mu)} = \frac{1}{\mu(1-\mu)} = \frac{1}{V(\mu)}.$$
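This identity is also easy to confirm numerically; a minimal check on a grid of $\mu$ values (not from the answer itself, just an illustration):

```python
import numpy as np

# Evaluate g'(mu) * V(mu) on a grid and confirm it is identically 1
# for the Bernoulli case.
mu = np.linspace(0.01, 0.99, 99)

g_prime = 1.0 / mu + 1.0 / (1.0 - mu)   # derivative of the logit link
V = mu * (1.0 - mu)                     # Bernoulli variance function

print(np.allclose(g_prime * V, 1.0))    # True
```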

For the general case one derives from the definition that $$E(Y) = \mu = b'(\theta) \quad \text{ and } \quad \text{Var}(Y) = b''(\theta) a(\psi),$$ see e.g. pages 28–29 in McCullagh and Nelder. With $g$ the canonical link we have $\theta = g(\mu) = g(b'(\theta))$, and the variance function is defined as $b''(\theta)$, which in terms of $\mu$ becomes $$V(\mu) = b''(g(\mu)).$$ Differentiating the identity $\theta = g(b'(\theta))$ with respect to $\theta$ gives, by the chain rule, $$1 = g'(b'(\theta)) b''(\theta) = g'(\mu) V(\mu),$$ which is the general relation between the canonical link function and the variance function.
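The general argument can be traced symbolically for a concrete family; a sketch for the Poisson case, assuming its cumulant function $b(\theta)=e^\theta$ (so $\mu = b'(\theta) = e^\theta$ and the canonical link is $g = \log$):

```python
import sympy as sp

# Poisson instance of the general relation g'(mu) * V(mu) = 1,
# assuming b(theta) = exp(theta) and canonical link g(mu) = log(mu).
theta = sp.symbols('theta')
mu = sp.symbols('mu', positive=True)

b = sp.exp(theta)
V = sp.diff(b, theta, 2).subs(theta, sp.log(mu))  # V(mu) = b''(g(mu)) = mu
g_prime = sp.diff(sp.log(mu), mu)                 # g'(mu) = 1/mu

print(sp.simplify(g_prime * V))                   # reduces to 1
```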

In the construction of quasi-likelihood functions it is natural to start from the relation between the mean and the variance, given in terms of the variance function $V$. In this context the antiderivative of $V(\mu)^{-1}$ can be interpreted as a generalization of the link function; see, for instance, the definition of the (log) quasi-likelihood on page 325 (formula 9.3) in McCullagh and Nelder.
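To illustrate this remark (an illustrative sketch, not from the answer): for the Bernoulli variance function, the logit link is precisely an antiderivative of $V(\mu)^{-1}$, which can be checked symbolically:

```python
import sympy as sp

# Check that the logit link differentiates to 1/V(mu), i.e. it is an
# antiderivative of V(mu)^(-1) for the Bernoulli variance function.
mu = sp.Symbol('mu', positive=True)

V = mu * (1 - mu)
logit = sp.log(mu / (1 - mu))

print(sp.simplify(sp.diff(logit, mu) - 1 / V))  # reduces to 0
```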

NRH
  • Thank you @NRH. Actually I know the equivalence for Bernoulli distribution. I am wondering the general case. And thanks for your reference, I'll check it :) – ziyuang Oct 20 '11 at 08:44
  • @ziyuang, the general case is now included. – NRH Oct 20 '11 at 09:46
  • 1
    @NRH - just to add to this answer, the mean and variance formulas can be derived by differentiating the equation $\int f(y,\theta,\psi)dy=1$ on both sides with respect to $\theta$ (or equivalently $\mu$). First derivative gives you the mean, second gives you the variance. – probabilityislogic Oct 20 '11 at 10:44
  • Thank you. And I've found another reference link: http://fedc.wiwi.hu-berlin.de/xplore/ebooks/html/spm/spmhtmlnode27.html – ziyuang Oct 20 '11 at 15:34