I was wondering if it is possible to update weights in logistic regression without gradient descent. If so, how?
- Then I think you don't even need weights; please take a look at this [answer](http://stackoverflow.com/questions/17679140/multiple-linear-regression-with-python/34877221#34877221). – Lerner Zhang Oct 30 '17 at 01:51
- Usually you *don't* use gradient descent for logistic regression: https://stats.stackexchange.com/q/344309/35989 – Tim Mar 01 '19 at 05:04
1 Answer
2
You can use a different optimizer to update the weights. But if you are asking about updating them without any optimizer at all, i.e. by trial and error, then you should not do that: finding the optimal weights that way would be very difficult.
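To make the first point concrete: for logistic regression specifically, Newton's method (iteratively reweighted least squares, the approach the comment under the question alludes to) updates the weights by solving a linear system at each step, rather than taking small steps along the gradient. A minimal sketch, assuming NumPy; the function names, toy data, and the small ridge term are mine, not from the question:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def irls_logistic(X, y, n_iter=25, ridge=1e-6):
    """Fit logistic regression by Newton's method (IRLS) -- no gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        p = sigmoid(X @ w)                # predicted probabilities
        grad = X.T @ (y - p)              # gradient of the log-likelihood
        W = p * (1.0 - p)                 # per-sample Hessian weights
        H = (X * W[:, None]).T @ X + ridge * np.eye(d)  # Hessian (+ tiny ridge for stability)
        w = w + np.linalg.solve(H, grad)  # full Newton step, no learning rate
    return w

# toy usage: labels driven by x plus noise, with a bias column
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = (x + 0.5 * rng.normal(size=200) > 0).astype(float)
X = np.column_stack([np.ones_like(x), x])
w = irls_logistic(X, y)
```

Each iteration re-solves a weighted least-squares problem, which is why this converges in a handful of steps where plain gradient descent might need thousands.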

Harshit Mehta
- Actually, updating without any optimizer. Something like in this link http://iamtrask.github.io/2015/07/12/basic-python-network/ on line number 39 – Shree Ranga Raju Oct 27 '17 at 06:26
- That problem is a two-layer neural network. The activation functions at the first layer and at the output are sigmoids. The cost function is the error (actual value minus predicted value), and it is also using gradient descent to update the weights. – Harshit Mehta Oct 27 '17 at 07:00
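To illustrate that last comment: the style of update used in tutorials like the linked post can be sketched as below (a reconstruction of that kind of single-layer sigmoid network, not the post's exact code; the variable names `syn0`, `l1`, `l1_delta` follow the post's conventions). The `syn0 += X.T @ l1_delta` line is exactly a gradient-descent step with learning rate 1, so it does not escape gradient descent at all:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy data: 4 samples, 3 inputs -> 1 output (the target is just the first column)
X = np.array([[0., 0., 1.],
              [0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 1.]])
y = np.array([[0.], [0.], [1.], [1.]])

rng = np.random.default_rng(1)
syn0 = 2 * rng.random((3, 1)) - 1        # random weights in [-1, 1)

for _ in range(10000):
    l1 = sigmoid(X @ syn0)               # forward pass
    l1_error = y - l1                    # "cost": actual minus predicted
    l1_delta = l1_error * l1 * (1 - l1)  # error scaled by the sigmoid derivative
    syn0 += X.T @ l1_delta               # this IS a gradient-descent update
```

The multiplication by the sigmoid derivative `l1 * (1 - l1)` is what makes the update a gradient of the squared error, not an ad hoc rule.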