Concepts (3)
Theory · Intermediate
KL Divergence (Kullback-Leibler Divergence)
Kullback–Leibler (KL) divergence measures how one probability distribution P assigns probability mass differently from a reference distribution Q; it is nonnegative and asymmetric, equal to zero only when P = Q.
#kl-divergence #kullback-leibler #cross-entropy +12
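A minimal NumPy sketch of the definition D_KL(P‖Q) = Σᵢ pᵢ log(pᵢ/qᵢ); the distributions p and q below are illustrative placeholders, not values from the source. Computing both directions makes the asymmetry concrete.

```python
import numpy as np

# Illustrative discrete distributions over the same 3-outcome support
# (placeholder values, not from the source). Q must be nonzero
# wherever P is nonzero, or the divergence is infinite.
p = np.array([0.5, 0.3, 0.2])  # distribution P
q = np.array([0.4, 0.4, 0.2])  # reference distribution Q

# D_KL(P || Q) = sum_i p_i * log(p_i / q_i); base-2 logs give bits.
kl_pq = np.sum(p * np.log2(p / q))
kl_qp = np.sum(q * np.log2(q / p))  # reversed direction: KL is asymmetric

print(f"D_KL(P||Q) = {kl_pq:.4f} bits")
print(f"D_KL(Q||P) = {kl_qp:.4f} bits")  # generally differs from the above
```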
Theory · Intermediate
Shannon Entropy
Shannon entropy quantifies the average uncertainty, or expected information content, of a random variable; with base-2 logarithms it is measured in bits.
#shannon-entropy #information-gain #mutual-information +12
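A short sketch of H(P) = -Σᵢ pᵢ log₂ pᵢ in NumPy, run over a few coin biases (the bias values are illustrative assumptions): a fair coin yields the maximum 1 bit, and entropy falls as the outcome becomes more predictable.

```python
import numpy as np

def shannon_entropy(p):
    """H(P) = -sum_i p_i * log2(p_i), in bits; 0 * log2(0) is taken as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability outcomes to avoid log2(0)
    return -np.sum(p * np.log2(p))

# A fair coin is maximally uncertain (1 bit); biased coins carry less.
for heads in (0.5, 0.9, 0.99):
    h = shannon_entropy([heads, 1 - heads])
    print(f"P(heads) = {heads}: H = {h:.4f} bits")
```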
Theory · Intermediate
Information Theory
Information theory quantifies uncertainty and information using measures like entropy, cross-entropy, KL divergence, and mutual information.
#entropy #cross-entropy #kl-divergence +12
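The measures listed above relate through a standard identity: cross-entropy decomposes as H(P, Q) = H(P) + D_KL(P‖Q). A minimal NumPy sketch verifying this on placeholder distributions (the values are illustrative assumptions, not from the source):

```python
import numpy as np

# Placeholder distributions (not from the source) over a shared support.
p = np.array([0.5, 0.3, 0.2])  # true distribution P
q = np.array([0.4, 0.4, 0.2])  # model distribution Q

entropy = -np.sum(p * np.log2(p))        # H(P)
cross_entropy = -np.sum(p * np.log2(q))  # H(P, Q)
kl = np.sum(p * np.log2(p / q))          # D_KL(P || Q)

# Identity tying the three measures together: H(P, Q) = H(P) + D_KL(P || Q).
assert np.isclose(cross_entropy, entropy + kl)
print(f"H(P) = {entropy:.4f}  H(P,Q) = {cross_entropy:.4f}  D_KL(P||Q) = {kl:.4f}")
```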