Shannon.entropy(v, zilchtol=1e-300)
The entropy of the squares of v is given by -sum( v^2 * log(v^2) ).
In this implementation any zero coefficients (determined by being less than zilchtol) have a zero contribution to the entropy.
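
For concreteness, the following is a minimal sketch of the computation just described. It is not the package's actual source, and whether the zilchtol test is applied to the coefficients or to their squares is an assumption here; the name shannon_entropy_sketch is purely illustrative.

shannon_entropy_sketch <- function(v, zilchtol = 1e-300) {
    vsq <- v^2
    # Assumption: squared coefficients below zilchtol are treated as exactly
    # zero, so they contribute nothing and log(0) is never evaluated.
    keep <- vsq > zilchtol
    -sum(vsq[keep] * log(vsq[keep]))
}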
The Shannon entropy measures how "evenly spread" a set of numbers is. If the magnitudes of the entries of a vector are roughly equal then the Shannon entropy is large. If the vector is sparsely populated, or its entries differ greatly in magnitude, then the Shannon entropy is near zero. Note that the input vectors to this function usually have unit norm, so that the squared coefficients sum to one and low entropy corresponds to sparsity.
#
# Generate some test data
#
# A sparse set
#
Shannon.entropy(c(1, 0, 0, 0))
# 0
#
# An evenly spread set
#
Shannon.entropy(rep(1/sqrt(4), 4))
# 1.386294
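
As noted above, a vector can first be rescaled to unit norm so that its squared entries sum to one; the short example below is purely illustrative and is not part of the package's own example set.

# Illustration: normalize x so that sum(x^2) == 1, then compute its entropy.
x <- c(3, 1, 0, 2)
xn <- x / sqrt(sum(x^2))
Shannon.entropy(xn)
# roughly 0.83; the zero entry contributes nothing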