Can someone give an intuition for the concept of 'entropy'?
I am reading about maximum entropy inverse reinforcement learning and wanted to ask what the intuitive meaning of 'entropy' is.
I understand that in this learning framework, entropy is defined as the negative sum of each probability times the log of that probability, $H(p) = -\sum_x p(x) \log p(x)$. Some derivations follow, and the resulting equation shows that the probability of a trajectory is directly proportional to the exponential of its reward, $P(\tau) \propto \exp(R(\tau))$.
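To make sure I am reading the definition correctly, here is a toy sketch I put together (plain Python, natural log; the distributions and rewards are made-up numbers, not from the paper):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_x p(x) * log p(x), in nats."""
    return -sum(px * math.log(px) for px in p if px > 0)

# Entropy measures how "spread out" a distribution is.
uniform = [0.25, 0.25, 0.25, 0.25]   # no outcome favoured
peaked  = [0.97, 0.01, 0.01, 0.01]   # one outcome dominates
print(entropy(uniform))  # ~1.386 (= log 4, the maximum for 4 outcomes)
print(entropy(peaked))   # ~0.17  (close to 0: almost no uncertainty)

# The exp(reward) form from the derivation: a softmax over toy
# trajectory rewards (hypothetical values).
rewards = [2.0, 1.0, 0.0]
z = sum(math.exp(r) for r in rewards)       # partition function
probs = [math.exp(r) / z for r in rewards]  # P(tau) proportional to exp(R(tau))
print(probs)  # higher-reward trajectories are exponentially more likely
```

So numerically the uniform distribution maximises the entropy, and a nearly deterministic one drives it towards zero.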
I understand the maths involved, but I STILL do not understand what ENTROPY is.
Insights welcome.