I was studying neural networks and activation functions, and it crossed my mind: what if the ReLU non-linearity were learned by a neural network itself? If that is possible, how?
To be able to represent non-linear relationships (such as the ReLU function), the neural network needs to contain non-linear activation functions to begin with. Since ReLU is about the simplest non-linearity one can imagine, I can't see the point of approximating it with more complicated ones. Anyway, you raise an interesting question about theoretical NN capabilities. It might help to specify more details about the scenario you have in mind, though; otherwise the answers will be rather vague. – Jan Kukacka Jan 21 '20 at 09:49
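A minimal NumPy sketch of the comment's first point (all values here are illustrative assumptions, not anything from the thread): any stack of purely linear layers collapses to a single linear map, and even the best such linear fit to ReLU keeps an irreducible error.

```python
# Stacking only linear layers yields another linear map, so the best a
# linear-only network can do on ReLU is an ordinary least-squares fit.
import numpy as np

x = np.linspace(-2, 2, 256)
y = np.maximum(x, 0.0)                      # ReLU samples on [-2, 2]

# Best possible linear fit y ~ a*x + b
A = np.stack([x, np.ones_like(x)], axis=1)
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
mse = np.mean((a * x + b - y) ** 2)
print(f"best linear fit: {a:.3f}*x + {b:.3f}, MSE = {mse:.4f}")  # MSE stays well above 0
```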
1 Answer
Yes. ReLU is a nice, well-behaved continuous function, and neural networks (with non-linear activations) are universal function approximators, so they can certainly learn to approximate ReLU.
For example: https://arxiv.org/abs/1906.09529
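For intuition, here is a minimal sketch, assuming a one-hidden-layer network with tanh activations; the hidden width, learning rate, and step count are arbitrary choices, not anything from the paper. Trained with plain gradient descent on samples of ReLU, its mean squared error drops close to zero.

```python
# Train a tiny tanh network to approximate ReLU on [-2, 2] (NumPy only).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 256).reshape(-1, 1)
y = np.maximum(x, 0.0)                      # the ReLU target

H = 32                                      # hidden width (assumed)
W1 = rng.normal(0, 1, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 1, (H, 1)); b2 = np.zeros(1)
lr = 0.05                                   # learning rate (assumed)

for step in range(5000):
    # forward pass
    h = np.tanh(x @ W1 + b1)                # hidden activations, (256, H)
    pred = h @ W2 + b2                      # network output, (256, 1)
    err = pred - y
    loss = np.mean(err ** 2)

    # backward pass (mean-squared-error gradients)
    g_pred = 2 * err / len(x)
    gW2 = h.T @ g_pred;        gb2 = g_pred.sum(0)
    g_h = g_pred @ W2.T * (1 - h ** 2)      # tanh'(z) = 1 - tanh(z)^2
    gW1 = x.T @ g_h;           gb1 = g_h.sum(0)

    # gradient-descent update
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(f"final MSE: {loss:.6f}")             # should be near zero
```

The smooth tanh units can only approximate ReLU's kink at 0, never match it exactly, which is consistent with the comment above: a network reproduces ReLU only approximately, by composing whatever non-linearities it already has.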

– robsmith11