If this is off-topic and going to be closed, I would appreciate a pointer to the right place to ask this question, as I am kind of lost right now.
I have a BSc in software engineering and recently started a deep learning position after taking multiple MOOCs on the topic and doing some side projects.
Some of those courses were:
- all of Andrew Ng's courses on Coursera
- David Silver's reinforcement learning course
- Stanford's CS231n
I am starting to notice gaps in my knowledge that keep popping up, mostly in information theory, probability, and statistics. For example:
- KL divergence
- information units (bits, nats)
- cross entropy
- why probabilities are always handled as log probabilities
- and more (a rough sketch of what I mean follows this list).
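To make the gap concrete, here is a rough sketch (in Python, with made-up example distributions) of the kind of thing I can compute but don't feel I really understand: cross entropy, KL divergence, and why everyone works with log probabilities rather than raw probabilities.

```python
import numpy as np

# Two hypothetical discrete distributions over four outcomes (just example numbers).
p = np.array([0.1, 0.4, 0.4, 0.1])      # "true" distribution
q = np.array([0.25, 0.25, 0.25, 0.25])  # model's distribution

# Cross entropy: H(p, q) = -sum_x p(x) * log q(x)
# (natural log here, so the values are in nats -- one of those "information units").
cross_entropy = -np.sum(p * np.log(q))

# KL divergence: D_KL(p || q) = sum_x p(x) * log(p(x) / q(x)),
# which equals the cross entropy minus the entropy of p.
kl = np.sum(p * np.log(p / q))
entropy_p = -np.sum(p * np.log(p))
assert np.isclose(cross_entropy, entropy_p + kl)

# Why log probabilities: multiplying many small probabilities underflows to zero,
# while summing their logs stays numerically well behaved.
many_probs = np.full(1000, 0.01)
print(np.prod(many_probs))          # 0.0 (underflow)
print(np.sum(np.log(many_probs)))   # about -4605.17, no underflow
```

I can run this and verify that the identity holds, but I don't have the theoretical background to explain why these quantities are defined the way they are.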
I find myself going down Wikipedia rabbit holes, trying to understand probabilistic and statistical concepts, and never finding an end (or a start).
I want to close this knowledge gap by taking some of the basic courses I seem to be missing.
Learning from Wikipedia doesn't work for me, as I seem to be missing some more fundamental background, though I don't know exactly what.
What are some relevant courses or topics I could work through with a structured syllabus?
I am hoping one or two theory courses would close this gap for me; I just don't know what to look for.