Mutual Information (MI) measures how much knowing one random variable reduces uncertainty about another; equivalently, I(X;Y) = H(X) − H(X|Y), and it is zero exactly when X and Y are independent.
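As a sketch of the definition, MI can be estimated from paired samples by comparing the empirical joint distribution against the product of the marginals. The function name and sample data below are illustrative, not from any particular library.

```python
from collections import Counter
import math

def mutual_information(pairs):
    """Estimate MI (in bits) from a list of (x, y) samples.

    Sums p(x,y) * log2(p(x,y) / (p(x) * p(y))) over observed pairs.
    """
    n = len(pairs)
    pxy = Counter(pairs)                      # joint counts
    px = Counter(x for x, _ in pairs)         # marginal counts for X
    py = Counter(y for _, y in pairs)         # marginal counts for Y
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint / (p(x) * p(y)) written to avoid three separate divisions
        mi += p_joint * math.log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Perfectly correlated binary variables: knowing X fully determines Y,
# so MI equals the entropy of either variable (1 bit here).
samples = [(0, 0), (1, 1)] * 50
print(mutual_information(samples))  # → 1.0
```

With independent variables the joint factors into the marginals, every log term is zero, and the estimate goes to zero (up to sampling noise).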
Shannon entropy quantifies the average uncertainty or information content of a random variable in bits when using base-2 logarithms.
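The definition above can be sketched directly: entropy is the expected value of −log2 p over the distribution. The helper below (an illustrative name, not a standard API) computes it from the empirical frequencies of a sample.

```python
from collections import Counter
import math

def entropy_bits(samples):
    """Shannon entropy in bits of the empirical distribution of samples.

    H = -sum over outcomes of p * log2(p).
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin carries 1 bit of uncertainty per flip.
print(entropy_bits(["H", "T"]))              # → 1.0
# Four equally likely outcomes carry 2 bits.
print(entropy_bits(["a", "b", "c", "d"]))    # → 2.0
```

Entropy is maximized by the uniform distribution (log2 of the number of outcomes) and drops toward zero as the distribution concentrates on a single outcome.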