The case is as follows. Suppose that

```python
import numpy as np

X = np.array([1, 1, 1])
y = np.array([1, 0, 1])
```
Then I fit a logistic regression with no intercept and inspect the fitted coefficient:
```python
from sklearn.linear_model import LogisticRegression

def fit_predict(X, y, fit_intercept=True):
    model = LogisticRegression(fit_intercept=fit_intercept)
    model.fit(X.reshape(-1, 1), y)
    print(f'model coefficients: {model.coef_}')

fit_predict(X, y, fit_intercept=False)
# output: [[0.2865409]]
```
I am pretty confused by this output. According to my algebra (directly solving the first-order optimality condition), the coefficient should be $\mathrm{logit}(2/3) = \ln 2 \approx 0.6931471805599452$.
Is this because my math is wrong, or because something else is going on that I don't know about?
The algebra is as follows. With a single coefficient $\beta$ and no intercept, setting the gradient of the log-likelihood to zero gives
$$ \sum_i \left( y_i - \mathrm{sigmoid}(\beta x_i) \right) x_i = 0. $$
Plugging in the values ($x_i = 1$ for all $i$ and $\sum_i y_i = 2$) yields $$ 2 = 3 \cdot \mathrm{sigmoid}(\beta), $$
so I conclude that $\beta = \mathrm{logit}(2/3)$.
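As a quick numerical sanity check (this is just my own derivation restated in code, nothing from sklearn), the candidate $\beta = \mathrm{logit}(2/3) = \ln 2$ does make the gradient above vanish:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([1, 1, 1])
y = np.array([1, 0, 1])

# logit(2/3) = ln((2/3) / (1/3)) = ln 2
beta = np.log(2)

# gradient of the log-likelihood at beta: sum_i (y_i - sigmoid(beta * x_i)) * x_i
grad = np.sum((y - sigmoid(beta * X)) * X)
print(beta, grad)  # gradient is (numerically) zero at this beta
```

So the stationarity condition itself checks out; the discrepancy with sklearn's `0.2865409` is what I cannot explain.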
Thanks in advance.