Mutual Information (MI) measures how much knowing one random variable reduces uncertainty about another.
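As a concrete illustration, MI can be computed directly from a discrete joint distribution via MI(X;Y) = sum over x,y of p(x,y) * log2(p(x,y) / (p(x) p(y))). The joint table below is an invented example, not from the source:

```python
import math
from collections import defaultdict

def mutual_information(p_xy):
    """MI in bits for a discrete joint distribution {(x, y): prob}."""
    # Marginal distributions p(x) and p(y)
    p_x, p_y = defaultdict(float), defaultdict(float)
    for (x, y), p in p_xy.items():
        p_x[x] += p
        p_y[y] += p
    # Sum p(x,y) * log2( p(x,y) / (p(x) * p(y)) ) over all outcomes
    return sum(
        p * math.log2(p / (p_x[x] * p_y[y]))
        for (x, y), p in p_xy.items()
        if p > 0
    )

# Two correlated binary variables: knowing X reduces uncertainty about Y
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(mutual_information(joint))  # positive: the variables share information
```

If X and Y were independent, p(x,y) would equal p(x)p(y) everywhere and the sum would be zero.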
Kullback-Leibler (KL) divergence measures how much a probability distribution P diverges from a reference distribution Q, i.e., how differently P allocates probability mass relative to Q.
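For discrete distributions this is D_KL(P || Q) = sum over x of P(x) * log2(P(x) / Q(x)). A minimal sketch with made-up distributions (the numbers are illustrative, not from the source):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) in bits for two discrete distributions given as lists."""
    # Terms with P(x) = 0 contribute nothing; Q(x) must be > 0 wherever P(x) > 0
    return sum(
        pi * math.log2(pi / qi)
        for pi, qi in zip(p, q)
        if pi > 0
    )

p = [0.5, 0.5]   # uniform distribution
q = [0.9, 0.1]   # skewed reference
print(kl_divergence(p, q))  # > 0: P spreads mass differently than Q
print(kl_divergence(p, p))  # 0: a distribution never diverges from itself
```

Note the asymmetry: D_KL(P || Q) generally differs from D_KL(Q || P), which is why KL divergence is not a true distance metric.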