I have read through the topic and I sort of understand the intuition around it. However, I keep seeing two other points mentioned, which I lay out below.
First, the error term in ordinary least squares regression is assumed to be normally distributed for the purposes of inference. With a binary dependent variable, the error term cannot be normally distributed; instead, it is said to follow a Bernoulli distribution. Can someone please explain why the error in such a case would follow a Bernoulli distribution?
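Here is my attempt at writing this out formally, assuming the linear probability model is the relevant setup (that assumption may be where I am going wrong):

$$y_i = x_i^\top \beta + \varepsilon_i, \qquad y_i \in \{0, 1\}$$

Given $x_i$, the term $x_i^\top \beta$ is a fixed number, so the error can take only two values:

$$\varepsilon_i = \begin{cases} 1 - x_i^\top \beta & \text{with probability } p_i \\ -\,x_i^\top \beta & \text{with probability } 1 - p_i \end{cases}$$

where $p_i = P(y_i = 1 \mid x_i)$. Is this two-valued (shifted Bernoulli) distribution what people mean when they say the error "follows a Bernoulli distribution"?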
Second, without going into the details, the assumption that the error term has constant variance also breaks: the error term becomes heteroscedastic. Can someone please explain this as well? Why would the error term become heteroscedastic?
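Continuing the same attempted setup, with $E(\varepsilon_i \mid x_i) = 0$ so that $p_i = x_i^\top \beta$, I get

$$\operatorname{Var}(\varepsilon_i \mid x_i) = p_i(1 - p_i) = x_i^\top \beta\,\bigl(1 - x_i^\top \beta\bigr),$$

which changes with $x_i$ rather than staying constant. Is this the reason the error term is described as heteroscedastic, or is there more to it?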