
Can I take an average of the values I got from the Shannon-Weaver index? Or is it wrong to average values that come from an equation involving the natural logarithm (ln), like the Shannon-Weaver entropy?

hyjk-stofl

2 Answers


This question gets at an interesting issue with using the Shannon entropy in practice: because the entropy is a nonlinear (concave) function of the probabilities, the plug-in estimate you get by substituting observed sample proportions into the entropy formula is biased below the true population entropy, with the size of the bias depending on the number of classes and the number of observations. This Cross Validated page discusses the issue further and contains links to further reading.
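For intuition, here is a minimal simulation sketch (not part of the original answer): it assumes a uniform five-class population and samples of size 20, both arbitrary choices, and shows the average plug-in estimate sitting below the true entropy.

```python
# Minimal sketch of the plug-in entropy bias; the uniform 5-class
# population, sample size 20, and replicate count are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)

def plugin_entropy(counts):
    """Shannon entropy (natural log) computed from observed counts."""
    p = counts / counts.sum()
    p = p[p > 0]                      # treat 0 * log(0) as 0
    return -np.sum(p * np.log(p))

k, n, reps = 5, 20, 10_000            # classes, sample size, replicates
true_p = np.full(k, 1 / k)
true_H = -np.sum(true_p * np.log(true_p))   # ln(5) ≈ 1.609

samples = rng.multinomial(n, true_p, size=reps)
est_H = np.array([plugin_entropy(c) for c in samples])

print(f"true entropy:         {true_H:.3f}")
print(f"mean plug-in entropy: {est_H.mean():.3f}")   # consistently below true_H
```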

So if the samples you are averaging over differ in size or in the number of classes/probabilities, you will be averaging entropy values that carry different biases, which can cause trouble. Even if all your samples have the same size and number of classes, the average entropy will still be biased low. Depending on your application that might be acceptable, but it would be wise to examine the literature on this as you proceed, as the other answer suggests.
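To illustrate the point about sample size, here is a similar sketch using scipy.stats.entropy (the class count and the sample sizes are again arbitrary assumptions, not from the answer): the downward bias shrinks as n grows, so averaging entropies from samples of mixed sizes mixes estimates with different biases.

```python
# Sketch: the bias of the plug-in entropy depends on sample size n, so an
# average over samples of different sizes mixes different biases.
# k = 5 classes and the sample sizes below are arbitrary assumptions.
import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(1)
k = 5
true_p = np.full(k, 1 / k)
true_H = entropy(true_p)               # ln(5) ≈ 1.609

for n in (10, 50, 500):
    counts = rng.multinomial(n, true_p, size=10_000)
    mean_H = np.mean([entropy(c) for c in counts])   # entropy() normalises counts
    print(f"n = {n:3d}: mean plug-in entropy = {mean_H:.3f} "
          f"(bias ≈ {mean_H - true_H:+.3f})")
# the bias is roughly -(k - 1) / (2 n), so the smallest samples drag an
# overall average down the most
```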

EdM

The question is short, but the answer is potentially very long: there are several journal papers on this issue; search for current papers on diversity partitioning and on beta diversity (Lou Jost is a good starting point). A brief indicative answer: the mean of logs is the log of the geometric mean, so the mean entropy does exist and has an interpretation, as does the mean of exp(diversity).
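As a small numeric illustration of that identity (the entropy values below are invented purely for the example): the arithmetic mean of Shannon entropies equals the log of the geometric mean of the effective species numbers exp(H), which is not the same as the log of their arithmetic mean.

```python
# Sketch of "mean of logs is log of geometric mean" with made-up entropies.
import numpy as np

H = np.array([1.2, 1.9, 0.7])            # Shannon entropies of three samples
D = np.exp(H)                            # effective species numbers, exp(H)

mean_H = H.mean()                          # mean of logs
geo_mean_D = np.exp(np.mean(np.log(D)))    # geometric mean of exp(H)
arith_mean_D = D.mean()                    # arithmetic mean of exp(H)

print(f"mean entropy:                  {mean_H:.3f}")
print(f"log of geometric mean exp(H):  {np.log(geo_mean_D):.3f}")   # identical
print(f"log of arithmetic mean exp(H): {np.log(arith_mean_D):.3f}")  # larger
```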