
If this is off-topic and going to be closed, I would appreciate knowing where it is right to ask this question, as I am kind of lost right now.


I have a BSc in software engineering, and recently started a deep learning position after completing multiple MOOCs on the topic and doing some side projects.

Some of those courses are:

  • all of Andrew Ng's courses on Coursera
  • David Silver's reinforcement learning
  • cs231n by Stanford

I keep running into knowledge gaps, mostly in the fields of information theory, statistics, and probability. For example:

  • KL divergence
  • information units
  • cross entropy
  • why are probabilities always handled as log probabilities?

and more.
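To make a few of these concrete, here is a small NumPy sketch (my own toy numbers, purely for illustration) of how entropy, cross entropy, and KL divergence relate, and of why log probabilities are preferred numerically:

```python
import numpy as np

# Two discrete distributions over the same 3 outcomes (made-up numbers)
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# Entropy of p, in nats (use np.log2 instead for bits -- the "information unit")
entropy_p = -np.sum(p * np.log(p))

# Cross entropy H(p, q): expected code length when encoding samples from p
# with a code that is optimal for q
cross_entropy = -np.sum(p * np.log(q))

# KL divergence D_KL(p || q): the "extra" code length paid for using q's code
kl = np.sum(p * np.log(p / q))

# The identity D_KL(p || q) = H(p, q) - H(p) holds numerically
assert np.isclose(kl, cross_entropy - entropy_p)

# Why log probabilities? A product of many small probabilities underflows
# to 0.0 in floating point, but its log becomes a stable sum.
probs = np.full(1000, 0.01)
print(np.prod(probs))          # underflows to 0.0
print(np.sum(np.log(probs)))   # a finite, usable number
```

The `assert` is exactly the relationship between the three quantities that made them confusing to me in the first place.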

I find myself going down rabbit holes on Wikipedia, trying to understand probabilistic and statistical theories, and never finding an end (or a start).


I wish to shrink my knowledge gap by taking some basic courses I seem to be missing.
Learning from Wikipedia doesn't work for me, as I seem to be lacking some more basic understanding; I just don't know exactly what.

What are some relevant courses or topics I can learn from a syllabus?

I am hoping one or two theoretical courses will close this gap for me; I just don't know what to look for.

edited by kjetil b halvorsen
asked by Gulzar
    I'm not sure what you mean by 'by a syllabus', but if you're willing to consider books, "Information Theory, Inference and Learning Algorithms" by MacKay links ML and information theory nicely, and is written at an undergraduate engineering level. "Elements of Information Theory" by Cover & Thomas is an excellent book on information theory itself at a general masters of engineering level. – Oxonon Dec 16 '20 at 12:51
  • 2
    FYI, I had a similar background to yours. 4 years into an ML PhD, I'm still down that very rabbit hole. If you're looking to do ML long term, I'd really suggest sitting down and working through a couple books back to front. Anything less and all of ML will seem like a bag of disjoint tricks. – Oxonon Dec 16 '20 at 12:54
  • @Oxonon Thanks! I will definitely set time aside for those. I do know myself, however, and know that watching lectures works better for me than books. Do you know of courses that follow any of those books, maybe? By "by a syllabus" I mean something structured that wouldn't require me to stop and go learn topics from scratch, but rather would teach those topics in the correct order. The lack of order and structured background is what makes Wikipedia-style learning not work for me. – Gulzar Dec 16 '20 at 12:58
  • 1
    MacKay's book is made specifically with a syllabus approach in mind; at the start he highlights the order in which one can read through the chapters. I'm not familiar with online lectures. I'd however treat those as a nice easy-going overview to use on top of detailed learning from a book, not as a substitute. I don't think (non-mathematical) lectures ever go into enough depth to learn a topic. – Oxonon Dec 16 '20 at 13:02
  • @Oxonon These are quite pricey, mostly time-wise. If I had to choose one, which should I go for? – Gulzar Jan 09 '21 at 01:07
  • 1
    MacKay's. It applies the ideas to topics more relevant for you, rather than purely channel coding etc. These books should be available legally online (preprints etc). – Oxonon Jan 10 '21 at 10:52
  • I like MacKay's book, but for some reason I keep going back to Cover and Thomas. – Fred Guth May 12 '21 at 01:34

1 Answer


Below are standard textbooks on information theory, each with plenty of content and derivations of information-theoretic measures:

  1. MacKay, Information Theory, Inference and Learning Algorithms
  2. Cover & Thomas, Elements of Information Theory
  3. Norwich, Information, Sensation and Perception
answered by develarist