Shannon entropy quantifies the average uncertainty, or information content, of a random variable; with base-2 logarithms it is measured in bits.
Information theory quantifies uncertainty and information using measures like entropy, cross-entropy, KL divergence, and mutual information.
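As a sketch of how these measures relate, the following computes entropy, KL divergence, and cross-entropy for small discrete distributions; the function names and example distributions are illustrative choices, not from the text above:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """KL divergence D(p||q) = sum(p_i * log2(p_i / q_i)), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = H(p) + D(p||q)."""
    return entropy(p) + kl_divergence(p, q)

fair = [0.5, 0.5]     # a fair coin: maximal uncertainty for two outcomes
biased = [0.9, 0.1]   # a biased coin: more predictable, so lower entropy

print(entropy(fair))                 # 1.0 bit
print(entropy(biased))               # about 0.469 bits
print(kl_divergence(biased, fair))   # extra bits per symbol when coding
                                     # the biased coin with a fair-coin code
```

Cross-entropy decomposing as entropy plus KL divergence is why minimizing cross-entropy against a fixed target distribution is equivalent to minimizing the KL divergence to it.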