
If a gradient points towards a max or a min, what stops gradient descent from maximizing error instead of minimizing it?

Is it the nature of the update step that makes this process one way?

  • Possible duplicate of [How can change in cost function be positive?](https://stats.stackexchange.com/questions/364360/how-can-change-in-cost-function-be-positive); see also https://stats.stackexchange.com/questions/367397/for-convex-problems-does-gradient-in-stochastic-gradient-descent-sgd-always-p/367459#367459 – Sycorax Apr 22 '19 at 17:19

1 Answer


In gradient descent, we follow the *negative* gradient direction, along which the objective function decreases rather than increases. So yes, it is the sign in the update step that makes the process one-way: the rule $x \leftarrow x - \eta \nabla f(x)$ moves against the gradient, and for a small enough learning rate $\eta$ the first-order change in the objective is $f(x - \eta \nabla f(x)) - f(x) \approx -\eta \|\nabla f(x)\|^2 \le 0$. Flipping the minus sign to a plus sign gives gradient *ascent*, which maximizes the objective instead.
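To make this concrete, here is a minimal sketch (plain Python, using a toy quadratic objective $f(x) = x^2$ chosen for illustration) showing that the minus sign in the update rule is what drives the objective down:

    # Minimal sketch: gradient descent on f(x) = x^2, whose gradient is 2x.
    def f(x):
        return x ** 2

    def grad_f(x):
        return 2 * x

    x = 5.0    # starting point
    lr = 0.1   # learning rate (step size)

    for step in range(20):
        x = x - lr * grad_f(x)   # minus sign: move AGAINST the gradient

    print(f"after descent: x = {x:.4f}, f(x) = {f(x):.6f}")  # near the minimum at 0

    # Using x + lr * grad_f(x) instead would be gradient ASCENT:
    # each step would then increase f, so the sign in the update rule
    # is exactly what makes the process one-way.

Each iteration here multiplies $x$ by $(1 - 2\eta) = 0.8$, so $x$ shrinks geometrically toward the minimum at $0$.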

Haitao Du