I'm working with Shannon, Tsallis, and Rényi entropies, and I need to normalize them for comparison purposes. For Shannon entropy you only need to divide by the log of the number of bins:
$$H(X) = -\sum_{i}\left({P(x_i) \log_b P(x_i)}\right)/\log_b(N)$$
where $N$ is the number of bins and $b$ is the log base (for Shannon entropy, typically $b = 2$). Dividing by $\log_b(N)$, the maximum attainable entropy, bounds the result to $[0, 1]$.
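A minimal sketch of this normalization in Python (assuming `p` is a probability vector over all $N$ bins, empty bins included, since $\log_b(N)$ is the maximum entropy over $N$ bins):

```python
import numpy as np

def normalized_shannon_entropy(p, b=2.0):
    """Shannon entropy divided by log_b(N); result lies in [0, 1]."""
    p = np.asarray(p, dtype=float)
    N = p.size                      # count all bins, including empty ones
    nz = p[p > 0]                   # convention: 0 * log(0) = 0
    H = -np.sum(nz * np.log(nz)) / np.log(b)
    # note: the base b cancels in the ratio H / log_b(N)
    return H / (np.log(N) / np.log(b))
```

For a uniform distribution the result is 1, and for a degenerate (single-bin) distribution it is 0.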
Edit: For Rényi entropy the normalizer is also $\log(N)$.
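The same normalization applied to Rényi entropy, as a sketch (assuming order $\alpha > 0$, $\alpha \neq 1$; for the uniform distribution the Rényi entropy equals $\log(N)$ for every such $\alpha$, so the normalized value is again bounded by 1):

```python
import numpy as np

def normalized_renyi_entropy(p, alpha, b=2.0):
    """Rényi entropy of order alpha (> 0, != 1) divided by log_b(N)."""
    p = np.asarray(p, dtype=float)
    N = p.size
    # H_alpha = log_b(sum p_i^alpha) / (1 - alpha)
    H = np.log(np.sum(p ** alpha)) / (1.0 - alpha) / np.log(b)
    return H / (np.log(N) / np.log(b))
```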
I'm still missing the corresponding normalizer for Tsallis entropy.