Is it sound/allowed to use entropy as a measure of (non-)uniformity of a data stream?
E.g. I calculate the Shannon entropy with the standard formula, $H = -\sum_i p_i \log_2 p_i$, based on various measures in the data stream. The measures are count values whose probabilities sum to 1. I repeat the calculation every second, so I get a new entropy value each second. Then I monitor the mean and standard deviation of the entropy values over a sliding window, say the past 10 seconds. If the standard deviation increases or exceeds a certain threshold, I assume the stream has become noisier/bumpier. Is it OK to use entropy in this context?
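
For concreteness, here is a minimal Python sketch of the procedure I have in mind; the window length, threshold value, and the counts I pass in are illustrative placeholders, not my real setup:

```python
import numpy as np
from collections import deque

def shannon_entropy(counts):
    """Shannon entropy (bits) of a count vector; probabilities are counts / total."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    p = p[p > 0]                      # skip zero-probability bins (0 * log 0 := 0)
    return -np.sum(p * np.log2(p))

window = deque(maxlen=10)             # entropy values from the past 10 seconds
STDDEV_THRESHOLD = 0.05               # placeholder threshold

def update(counts_this_second):
    """Called once per second with the counts observed during that second."""
    window.append(shannon_entropy(counts_this_second))
    if len(window) == window.maxlen:
        mean, std = np.mean(window), np.std(window)
        if std > STDDEV_THRESHOLD:
            print(f"stream got bumpier: mean={mean:.3f}, std={std:.3f}")
```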