I am currently reading this RNN blog, which covers backpropagation through time (BPTT). I am struggling to derive it, and I don't understand how to approach such derivations in general. Derivations like this are a pain point for me throughout the ML/DL/AI literature. Is there any resource that explains derivatives of higher-dimensional objects (vectors, matrices, tensors) in detail, with worked examples?
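For concreteness, here is a minimal sketch of the kind of derivation I mean, in my own notation for a vanilla RNN (the blog may set things up differently). With hidden state $h_t = \tanh(W_{hh} h_{t-1} + W_{xh} x_t)$ and total loss $L = \sum_t L_t$, the BPTT gradient with respect to the recurrent weights unrolls into a sum of products of Jacobians:

$$\frac{\partial L}{\partial W_{hh}} = \sum_{t} \sum_{k=1}^{t} \frac{\partial L_t}{\partial h_t} \left( \prod_{j=k+1}^{t} \frac{\partial h_j}{\partial h_{j-1}} \right) \frac{\partial h_k}{\partial W_{hh}},$$

where $\partial h_k / \partial W_{hh}$ denotes the immediate partial derivative, holding $h_{k-1}$ fixed. It is exactly steps like computing $\partial h_j / \partial h_{j-1}$ (a vector-by-vector derivative, i.e. a Jacobian matrix) and $\partial h_k / \partial W_{hh}$ (a vector-by-matrix derivative, a third-order object) that I don't know how to carry out rigorously.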
- Did you have a look at https://stats.stackexchange.com/questions/198061/why-the-sudden-fascination-with-tensors? It contains references. – kjetil b halvorsen Sep 17 '18 at 08:58
- [The Matrix Calculus You Need For Deep Learning](https://arxiv.org/pdf/1802.01528.pdf) – Djib2011 Sep 17 '18 at 10:16