I am deriving logistic regression's likelihood. I have seen two different versions:
$$\begin{equation} f(y\mid\beta)=\prod_{i=1}^{N} \frac{n_i!} {y_i!\,(n_i-y_i)!}\, \pi_{i}^{y_i}(1-\pi_i)^{n_i - y_i} \tag 1 \end{equation}$$
Or this:
$$\begin{equation} L(\beta_0,\beta_1)= \displaystyle \prod_{i=1}^{N}p(x_i)^{y_i}(1-p(x_i))^{1-y_i} \tag 2 \end{equation}$$
Why is there a binomial coefficient $\frac{n_i!}{y_i!\,(n_i-y_i)!}$ in equation 1, but no corresponding term in equation 2?
Sources:
- First: https://czep.net/stat/mlelr.pdf (page 3, eq. 2)
- Second: http://www.stat.cmu.edu/~cshalizi/uADA/12/lectures/ch12.pdf (page 5, eq. 12.6)
Note: This question is not a duplicate of "What does 'likelihood is only defined up to a multiplicative constant of proportionality' mean in practice?" Once you see how it is done, the answer can be traced back to the binomial distribution, but nobody reading that post would have known that its question is the answer to this one.
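To make the relationship between the two forms concrete, here is a minimal sketch (the outcomes `y` and fitted probabilities `p` are made-up values, not from either source): when each observation is a single Bernoulli trial ($n_i = 1$), the binomial coefficient $\binom{1}{y_i}$ equals 1 for $y_i \in \{0, 1\}$, so the product in equation (1) is numerically identical to the product in equation (2).

```python
from math import comb, prod

# Hypothetical data: binary outcomes y_i and fitted probabilities p(x_i).
y = [0, 1, 1, 0, 1]
p = [0.2, 0.7, 0.6, 0.3, 0.8]

# Equation (2): Bernoulli likelihood, no combinatorial term.
L2 = prod(pi**yi * (1 - pi)**(1 - yi) for yi, pi in zip(y, p))

# Equation (1) with n_i = 1: the coefficient C(1, y_i) is always 1,
# so it multiplies each factor by 1 and the product is unchanged.
L1 = prod(comb(1, yi) * pi**yi * (1 - pi)**(1 - yi) for yi, pi in zip(y, p))

print(L1 == L2)  # True: with n_i = 1 the two likelihoods coincide
```

More generally, for $n_i > 1$ the coefficient depends only on the data $(n_i, y_i)$ and not on $\beta$, so it rescales the likelihood by a constant and does not move the maximizer.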