
In a Machine Learning lecture on entropy and information, the professor explained that $\text{Surprisal} = s_y(c) = -\log p(y=c)$, where $p(y=c)$ is the probability that a label $y$ from the dataset belongs to class $c$. Also, $\text{Entropy} = E[s_y(c)]$. He went on to state that $s_y(c)$ is the number of bits needed to efficiently encode a class.
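For concreteness, here is a minimal sketch of these two quantities in Python, assuming a made-up 4-class distribution (the probabilities are illustrative, not from the lecture) and base-2 logarithms so that surprisal comes out in bits:

```python
import numpy as np

# Hypothetical class probabilities p(y = c) -- illustrative values only.
p = np.array([0.5, 0.25, 0.125, 0.125])

# Surprisal of each class: s_y(c) = -log2 p(y = c), measured in bits.
surprisal = -np.log2(p)

# Entropy is the expected surprisal: E[s_y(c)] = sum_c p(c) * s_y(c).
entropy = np.sum(p * surprisal)

print(surprisal)  # [1. 2. 3. 3.]
print(entropy)    # 1.75
```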

What exactly does "number of bits needed to efficiently encode a class" mean?

  • Does this answer your question? [What is the role of the logarithm in Shannon's entropy?](https://stats.stackexchange.com/questions/87182/what-is-the-role-of-the-logarithm-in-shannons-entropy) – mhdadk Oct 31 '21 at 21:27
  • Yes, it kind of does. Thank you! – Divyaanand Sinha Nov 01 '21 at 04:42
