Rényi entropy generalizes Shannon entropy by measuring uncertainty with a tunable emphasis on common versus rare outcomes.
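As a concrete illustration, here is a minimal sketch of Rényi entropy of order α (the function name and example distribution are illustrative, not from the source). Larger α emphasizes common outcomes, smaller α emphasizes rare ones, and α → 1 recovers Shannon entropy:

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy of order alpha, in bits.

    alpha > 1 weights common outcomes more heavily; alpha < 1
    weights rare outcomes; alpha == 1 is the Shannon limit.
    """
    if alpha == 1:
        # Limit as alpha -> 1: Shannon entropy
        return -sum(p * math.log2(p) for p in probs if p > 0)
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

dist = [0.5, 0.25, 0.25]
shannon = renyi_entropy(dist, 1)      # Shannon entropy: 1.5 bits
collision = renyi_entropy(dist, 2)    # collision entropy, < Shannon
large_alpha = renyi_entropy(dist, 50) # approaches min-entropy -log2(max p)
```

Rényi entropy is non-increasing in α, so the collision entropy here is below the Shannon value, and as α grows the result approaches the min-entropy, −log₂(max p).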
Mutual Information (MI) measures how much knowing one random variable reduces uncertainty about another.
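A small sketch computing MI from a joint distribution (the function name and example tables are illustrative assumptions): I(X;Y) = Σ p(x,y) log₂ p(x,y) / (p(x)p(y)). Perfectly correlated bits share one full bit of information; independent bits share none:

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits, where joint[x][y] = p(x, y)."""
    px = [sum(row) for row in joint]              # marginal p(x)
    py = [sum(col) for col in zip(*joint)]        # marginal p(y)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

correlated = [[0.5, 0.0], [0.0, 0.5]]    # X determines Y: I = 1 bit
independent = [[0.25, 0.25], [0.25, 0.25]]  # X tells nothing about Y: I = 0
```

Equivalently, I(X;Y) = H(Y) − H(Y|X): the drop in uncertainty about Y once X is known.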
Information theory quantifies uncertainty and information using measures like entropy, cross-entropy, KL divergence, and mutual information.
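These measures are tightly related; a minimal sketch (function names are illustrative) showing the identity H(p, q) = H(p) + D_KL(p ∥ q):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy H(p, q): expected bits to encode samples from p
    using a code optimized for q."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """KL divergence D_KL(p || q): the extra bits paid for assuming q
    when the data actually follow p. Always >= 0, zero iff p == q."""
    return cross_entropy(p, q) - entropy(p)

p = [0.5, 0.5]   # true distribution
q = [0.9, 0.1]   # mismatched model
```

With a matched model (q = p) the cross-entropy equals the entropy and the KL divergence is zero; any mismatch adds a strictly positive penalty.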