18

Suppose we have the following logistic regression model:

$$\text{logit}(p) = \beta_0+\beta_{1}x_{1} + \beta_{2}x_{2}$$

Is $\beta_0$ the odds of the event when $x_1 = 0$ and $x_2 = 0$? In other words, is it the odds of the event when $x_1$ and $x_2$ are at their lowest levels (even if those levels are not 0)? For example, if $x_1$ and $x_2$ take only the values $2$ and $3$, then we cannot set them to 0.
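For concreteness, here is a minimal sketch in Python of how the pieces of such a model fit together; the coefficient and covariate values are made up purely for illustration. The linear predictor is the log-odds, and the probability is recovered with the inverse logit:

```python
import numpy as np

# Made-up coefficients, purely for illustration.
beta0, beta1, beta2 = -1.5, 0.8, 0.3
x1, x2 = 2.0, 3.0                            # e.g., covariates that only take the values 2 and 3

log_odds = beta0 + beta1 * x1 + beta2 * x2   # logit(p), the linear predictor
odds = np.exp(log_odds)
p = odds / (1.0 + odds)                      # inverse logit

print(log_odds, odds, p)
```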

gung - Reinstate Monica
logisticgu
  • I believe you will find the answer at http://stats.stackexchange.com/questions/91402 to be revealing and helpful. With minor changes, it applies directly to your situation. – whuber Apr 07 '14 at 18:15
  • @whuber: So in my example, $x_1 = 0$ and $x_2 = 0$ are outside the range of my data? And thus $\beta_0$ has no meaningful interpretation. – logisticgu Apr 07 '14 at 18:23

3 Answers

31

$\beta_0$ is not the odds of the event when $x_1 = x_2 = 0$, it is the log of the odds. In addition, it is the log odds only when $x_1 = x_2 = 0$, not when they are at their lowest non-zero values.
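A small simulation sketch (assuming `numpy` and `statsmodels` are available; all numbers are made up) illustrates this: the fitted intercept estimates the log odds, and its exponential the odds, among observations with $x_1 = x_2 = 0$.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50_000
x1 = rng.integers(0, 2, size=n)   # 0/1 covariates, so x1 = x2 = 0 occurs in the data
x2 = rng.integers(0, 2, size=n)
eta = -1.0 + 0.7 * x1 + 0.4 * x2  # made-up true beta0, beta1, beta2
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))

fit = sm.Logit(y, sm.add_constant(np.column_stack([x1, x2]))).fit(disp=0)
b0 = fit.params[0]

mask = (x1 == 0) & (x2 == 0)
p_hat = y[mask].mean()            # empirical event probability at x1 = x2 = 0

print(b0)                         # roughly -1.0: the LOG odds at x1 = x2 = 0
print(np.exp(b0))                 # roughly 0.37: the odds at x1 = x2 = 0
print(p_hat / (1 - p_hat))        # empirical odds, close to exp(b0)
```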

gung - Reinstate Monica
  • Hence $\beta_0$ has no meaningful interpretation in my situation. – logisticgu Apr 07 '14 at 18:24
  • Hence $\beta_0$ has no meaningful *independent* interpretation in your situation. That is often the case. It is still an integral part of the model. If you dropped it from the model, the rest of the model (e.g., the estimate $\hat\beta_1$) would be biased. – gung - Reinstate Monica Apr 07 '14 at 18:32
  • (+1) There are various ways you can make the intercept meaningful. For instance, if you are interested in the log odds when $x_1=2$ and $x_2=3$, then regress $p$ against $x_1-2$ and $x_2-3$. Of course you will get the same value by plugging $x_1=2$ and $x_2=3$ into the current model, giving $\beta_0+2\beta_1+3\beta_2$, but the default software output would likely automatically include a test comparing this to zero. (A short sketch of this centering trick appears after these comments.) – whuber Apr 07 '14 at 18:33
  • @gung: In a similar manner, $\exp(\beta_1)$ is comparing $x_1= 3$ to $x_1=2$ when all other variables are held constant? – logisticgu Apr 07 '14 at 19:15
  • Yes, $\exp(\beta_1)$ is the odds ratio associated with a 1-unit change in $x_1$ (it can be any pair of values 1 unit apart) when all else is held constant. – gung - Reinstate Monica Apr 07 '14 at 19:20
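To make the centering suggestion and the odds-ratio interpretation in the comments above concrete, here is a small sketch (again assuming `numpy` and `statsmodels`; the data are simulated with made-up coefficients, and $x_1, x_2$ take only the values 2 and 3, as in the question):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 50_000
x1 = rng.integers(2, 4, size=n)   # x1 and x2 take only the values 2 and 3
x2 = rng.integers(2, 4, size=n)
eta = -1.0 + 0.7 * x1 + 0.4 * x2  # made-up true coefficients
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))

# Original parameterization: the intercept refers to x1 = x2 = 0, outside the data.
fit = sm.Logit(y, sm.add_constant(np.column_stack([x1, x2]))).fit(disp=0)
b0, b1, b2 = fit.params

# Centered parameterization: the intercept now refers to x1 = 2, x2 = 3.
fit_c = sm.Logit(y, sm.add_constant(np.column_stack([x1 - 2, x2 - 3]))).fit(disp=0)

print(fit_c.params[0])            # log odds at x1 = 2, x2 = 3
print(b0 + 2 * b1 + 3 * b2)       # the same value (up to numerical precision), from the original fit
print(np.exp(b1))                 # odds ratio for a 1-unit increase in x1 (e.g., 3 vs 2)
```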
5

There might also be a case where $x_1$ and $x_2$ cannot both be equal to $0$ at the same time. In that case $\beta_0$ does not have a clear interpretation.

Otherwise $\beta_0$ does have an interpretation: it shifts the log odds to its actual value when none of the variables can do so.

Silverfish
  • Note that you can use latex typesetting here by enclosing text in dollar signs, e.g. `$x^{2}$` produces $x^2$ and `$\beta_0$` produces $\beta_0$ – Silverfish Sep 16 '16 at 21:35
1

I suggest looking at it a different way ...

In logistic regression we predict a binary class {0 or 1} by modeling the probability of the event, $p$; the quantity the linear predictor actually outputs is $\text{logit}(p)$, the log-odds.

This, of course, assumes that the log-odds can reasonably be described by a linear function, e.g., $\beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dotsm$

This is a big assumption, and it only sometimes holds true: the log-odds is taken to be made up of a fixed component, $\beta_0$, increased incrementally by each successive term $\beta_i x_i$. If the $x_i$ components do not have that kind of independent, additive influence on the log-odds, it is best to choose another statistical framework.

In short, $\beta_0$ is the "fixed component" of that component-wise description of the log-odds of whatever event or condition you are trying to predict. Also remember that a regression ultimately describes a conditional average, given a set of $x_i$ values. None of this requires that $x_i = 0$ appear in your data, or even be possible in reality. $\beta_0$ simply shifts the linear expression up or down so that the variable components fit as well as possible.
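As a small numerical sketch of that shifting role (made-up slope and intercepts only): changing $\beta_0$ moves the whole log-odds line up or down, while the contribution of $x_1$ stays the same.

```python
import numpy as np

def inv_logit(eta):
    """Convert log-odds to a probability."""
    return 1 / (1 + np.exp(-eta))

x1 = np.linspace(2, 3, 5)          # covariate values in the range from the question
beta1 = 0.7                        # made-up slope

for beta0 in (-2.0, 0.0, 2.0):     # three hypothetical intercepts
    log_odds = beta0 + beta1 * x1  # same slope, different vertical shift
    print(beta0, np.round(inv_logit(log_odds), 3))
```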

Maybe I have said the same thing from a slightly different angle, but I hope this helps.

kjetil b halvorsen
Omeed