Concepts
Theory · Advanced
Variational Inference Theory
Variational Inference (VI) replaces an intractable posterior with a simpler, tractable distribution and fits it by minimizing the KL divergence between the approximation and the true posterior, which is equivalent to maximizing the Evidence Lower Bound (ELBO).
#variational inference · #elbo · #kl divergence
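A minimal sketch of the idea (not any particular library's API): fit a Gaussian q to a toy conjugate model where the exact posterior is known, estimating the ELBO by Monte Carlo. The model, the observation x = 1.0, and the parameter grid are all assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy model: z ~ N(0, 1), x | z ~ N(z, 1), observed x = 1.0.
# The exact posterior is N(0.5, 0.5), so a good fit should land near
# mu = 0.5, sigma = sqrt(0.5) ~ 0.71.
x = 1.0

def log_joint(z):
    # log p(x, z) = log N(z; 0, 1) + log N(x; z, 1), dropping constants shared by all z
    return -0.5 * z**2 - 0.5 * (x - z) ** 2

def elbo(mu, sigma, n=100_000):
    # Monte Carlo estimate of E_q[log p(x, z)] - E_q[log q(z)] with q = N(mu, sigma^2)
    z = rng.normal(mu, sigma, size=n)
    log_q = -0.5 * ((z - mu) / sigma) ** 2 - np.log(sigma)
    return np.mean(log_joint(z) - log_q)

# A coarse grid search over q's parameters stands in for the gradient-based
# optimization used in practice.
candidates = [(mu, s) for mu in np.linspace(0.0, 1.0, 11)
              for s in np.linspace(0.3, 1.2, 10)]
mu_best, sigma_best = max(candidates, key=lambda ps: elbo(*ps))
print(mu_best, sigma_best)  # expected: roughly (0.5, 0.7)
```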
Theory · Intermediate
Mutual Information
Mutual Information (MI) measures how much knowing one random variable reduces uncertainty about another.
#mutual information · #entropy · #kl divergence
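A small worked example (the joint distribution is assumed for illustration): computing I(X; Y) for two binary variables directly from the definition, then confirming it equals H(X) - H(X | Y), the reduction in uncertainty about X from knowing Y.

```python
import numpy as np

# Assumed joint distribution of two binary variables; rows index X, columns index Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1, keepdims=True)   # marginal P(X), shape (2, 1)
p_y = p_xy.sum(axis=0, keepdims=True)   # marginal P(Y), shape (1, 2)

# I(X; Y) = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) ), in nats
mi = np.sum(p_xy * np.log(p_xy / (p_x * p_y)))

# Equivalently, I(X; Y) = H(X) - H(X | Y)
h_x = -np.sum(p_x * np.log(p_x))
h_x_given_y = -np.sum(p_xy * np.log(p_xy / p_y))  # p_xy / p_y is p(x | y)
print(mi, h_x - h_x_given_y)  # the two computations agree (~0.193 nats)
```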
Theory · Intermediate
KL Divergence (Kullback-Leibler Divergence)
Kullback–Leibler (KL) divergence measures how one probability distribution P differs from a reference distribution Q, weighting each outcome by how differently P and Q assign probability mass to it.
#kl divergence · #kullback-leibler · #cross-entropy
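A brief sketch (P and Q are assumed for illustration) computing D_KL(P || Q) for discrete distributions, showing that it is not symmetric and that it equals cross-entropy minus entropy.

```python
import numpy as np

def kl(p, q):
    # D_KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats; assumes both have full support
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * np.log(p / q))

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.5, 0.3, 0.2])

print(kl(p, q), kl(q, p))  # different values: KL is not symmetric

# Relation to cross-entropy: D_KL(P || Q) = H(P, Q) - H(P)
cross_entropy = -np.sum(p * np.log(q))
entropy = -np.sum(p * np.log(p))
print(cross_entropy - entropy)  # matches kl(p, q)
```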
Theory · Intermediate
Information Theory
Information theory quantifies uncertainty and information using measures like entropy, cross-entropy, KL divergence, and mutual information.
#entropy · #cross-entropy · #kl divergence
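A short sketch (the coin biases are assumed for illustration) computing Shannon entropy in bits: it peaks at 1 bit for a fair coin and drops to 0 when the outcome is certain.

```python
import numpy as np

def entropy(p):
    # Shannon entropy H(P) = -sum_i p_i * log2(p_i), in bits; 0 * log 0 is taken as 0
    p = np.asarray(p, float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

# Entropy of a biased coin across a few assumed bias values
for p_heads in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(p_heads, entropy([p_heads, 1 - p_heads]))
```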