What is the derivative of the ReLU activation function defined as:
$$ \mathrm{ReLU}(x) = \max(0, x)$$
What about the special case at $x = 0$, where the function has a kink and is not differentiable?
The derivative is:
$$ \mathrm{ReLU}'(x) = \begin{cases} 0 & \text{if } x < 0 \\ 1 & \text{if } x > 0 \end{cases} $$
and it is undefined at $x = 0$.
The reason it is undefined at $x = 0$ is that the left- and right-hand derivatives there are not equal:
$$ \lim_{h \to 0^-} \frac{\mathrm{ReLU}(0+h) - \mathrm{ReLU}(0)}{h} = \lim_{h \to 0^-} \frac{0}{h} = 0, \qquad \lim_{h \to 0^+} \frac{\mathrm{ReLU}(0+h) - \mathrm{ReLU}(0)}{h} = \lim_{h \to 0^+} \frac{h}{h} = 1. $$
Note that $\mathrm{ReLU}$ itself is continuous at $x = 0$; it is the slope, not the function, that jumps there, which is exactly why the derivative does not exist at that point.
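In practice this single undefined point causes no trouble: since any value in $[0, 1]$ is a valid subgradient at $x = 0$, automatic-differentiation frameworks simply pick one, most commonly $0$. Here is a minimal NumPy sketch of that convention (the function names `relu` and `relu_grad` are illustrative, not a library API):

```python
import numpy as np

def relu(x):
    """ReLU(x) = max(0, x), applied elementwise."""
    return np.maximum(0.0, x)

def relu_grad(x):
    """Subgradient of ReLU: 0 for x < 0, 1 for x > 0.

    At x == 0 the derivative is undefined; this sketch follows the
    common framework convention of returning 0 there (any value in
    [0, 1] would be a valid subgradient).
    """
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))       # [0. 0. 3.]
print(relu_grad(x))  # [0. 0. 1.]
```

The choice of value at $x = 0$ rarely matters during training, because the probability of an activation landing exactly on $0$ is negligible with floating-point inputs.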