Theory · Intermediate
KL Divergence (Kullback-Leibler Divergence)
Kullback–Leibler (KL) divergence measures how one probability distribution P assigns probability mass differently from a reference distribution Q.
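For discrete distributions on a shared support, the usual definition (stated here as a sketch, since the entry above does not give the formula) is

$$D_{\mathrm{KL}}(P \,\|\, Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)},$$

which is zero when P and Q agree and grows as P places mass where Q assigns little. A minimal Python sketch of this sum, assuming both inputs are valid probability vectors over the same outcomes and that Q is positive wherever P is:

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q), in nats.

    Assumes p and q are probability vectors over the same support
    and that q > 0 wherever p > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: P puts more mass on the first outcome than Q does.
print(kl_divergence([0.7, 0.2, 0.1], [0.5, 0.3, 0.2]))  # ≈ 0.085 nats
```

Note that the measure is asymmetric: swapping the arguments generally gives a different value, which is why Q is called the reference distribution.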