Kullback–Leibler (KL) divergence measures how one probability distribution P differs from a reference distribution Q: for discrete distributions, D_KL(P || Q) = Σ_x P(x) log(P(x) / Q(x)), the expected extra information cost of using Q to model data drawn from P. It is non-negative, zero only when P = Q, and asymmetric in its arguments.
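As a minimal sketch of this formula for discrete distributions given as probability arrays (the function name `kl_divergence` is an illustrative choice, not from the source):

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) in nats.

    Assumes p and q are probability arrays over the same support;
    terms where p is zero contribute nothing, by convention.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # positive, and != kl_divergence(q, p): KL is asymmetric
```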
Information theory quantifies uncertainty and information using measures like entropy, cross-entropy, KL divergence, and mutual information.
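A compact sketch of the other measures listed, under the same assumption of discrete distributions as probability arrays (function names here are illustrative):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(P) in nats: average surprise of P."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def cross_entropy(p, q):
    """Cross-entropy H(P, Q) = H(P) + D_KL(P || Q): cost of coding P with Q's code."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(-np.sum(p[mask] * np.log(q[mask])))

def mutual_information(joint):
    """Mutual information I(X; Y) from a joint probability table:
    the KL divergence between the joint and the product of its marginals."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)  # marginal of X, shape (n, 1)
    py = joint.sum(axis=0, keepdims=True)  # marginal of Y, shape (1, m)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log((joint / (px * py))[mask])))
```

These satisfy the usual identities, e.g. cross_entropy(p, q) equals entropy(p) plus the KL divergence from Q to P, which is one way to check the implementation.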