It is said that auto-diff is very efficient at generating the derivatives needed for backpropagation. Why is it, then, that some of the most widely used deep learning libraries, such as Theano and TensorFlow, do not use this functionality? Is it because of differing implementation views, or is there some drawback of auto-diff that has kept them away from it?
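For context, here is a minimal reverse-mode automatic differentiation sketch in plain Python. It is a hypothetical illustration, not the internals of Theano or TensorFlow: each value records the local derivatives of the operation that produced it, and a backward pass accumulates them with the chain rule, which is exactly the mechanism backpropagation relies on.

```python
# Minimal reverse-mode autodiff sketch (illustrative only).
# Real libraries topologically sort the graph instead of recursing,
# but the chain-rule bookkeeping is the same idea.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs of (parent Var, local derivative)
        self.grad = 0.0

    def __mul__(self, other):
        # d(a*b)/da = b.value, d(a*b)/db = a.value
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def __add__(self, other):
        # d(a+b)/da = d(a+b)/db = 1
        return Var(self.value + other.value,
                   ((self, 1.0), (other, 1.0)))

    def backward(self, seed=1.0):
        # Accumulate chain-rule contributions into each parent.
        self.grad += seed
        for parent, local_grad in self.parents:
            parent.backward(seed * local_grad)

# Example: f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x.
x, y = Var(2.0), Var(3.0)
f = x * y + x
f.backward()
print(x.grad, y.grad)   # 4.0 2.0
```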
Yes it does... https://stackoverflow.com/questions/36370129/does-tensorflow-use-automatic-or-symbolic-gradients – Alex R. Mar 16 '18 at 07:21