Concepts
Concentration Inequalities
Concentration inequalities give high-probability bounds that random outcomes stay close to their expectations, even without knowing the full distribution.
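As an illustrative sketch (not from the original entry), the following Python snippet compares the empirical deviation of a sample mean of fair coin flips against Hoeffding's bound P(|mean − μ| ≥ t) ≤ 2·exp(−2nt²); the sample size, threshold, and number of trials are arbitrary choices.

```python
import numpy as np

# Hoeffding's inequality for i.i.d. variables bounded in [0, 1]:
#   P(|mean - mu| >= t) <= 2 * exp(-2 * n * t**2)
# Illustrative parameters (arbitrary choices, not from the original text).
n, t, trials = 1_000, 0.05, 20_000
rng = np.random.default_rng(0)

flips = rng.integers(0, 2, size=(trials, n))   # fair coin, mu = 0.5
deviations = np.abs(flips.mean(axis=1) - 0.5)

empirical = (deviations >= t).mean()
hoeffding = 2 * np.exp(-2 * n * t**2)

print(f"empirical P(|mean - mu| >= {t}): {empirical:.4f}")
print(f"Hoeffding bound:                 {hoeffding:.4f}")
```

The empirical frequency comes out well below the bound, which is the point: the bound holds without knowing anything about the distribution beyond its range.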
ELBO (Evidence Lower Bound)
The Evidence Lower Bound (ELBO) is a tractable lower bound on the log evidence log p(x) that enables learning and inference in latent variable models like VAEs.
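A minimal numerical sketch (a toy example of my own, not from the entry): for a discrete latent z, the ELBO E_q[log p(x, z) − log q(z)] is computed for an arbitrary q and compared against the exact log evidence log p(x); the gap equals KL(q(z) ‖ p(z | x)), so the ELBO is always a lower bound.

```python
import numpy as np

# Toy model with a discrete latent z in {0, 1, 2} (arbitrary numbers, for illustration).
p_z = np.array([0.5, 0.3, 0.2])          # prior p(z)
p_x_given_z = np.array([0.9, 0.4, 0.1])  # likelihood p(x | z) for one observed x

p_xz = p_z * p_x_given_z                 # joint p(x, z)
log_evidence = np.log(p_xz.sum())        # exact log p(x)

q = np.array([0.6, 0.3, 0.1])            # an arbitrary variational distribution q(z)
elbo = np.sum(q * (np.log(p_xz) - np.log(q)))

# The gap is exactly KL(q(z) || p(z | x)) >= 0.
posterior = p_xz / p_xz.sum()
kl_gap = np.sum(q * (np.log(q) - np.log(posterior)))

print(f"log p(x) = {log_evidence:.4f}")
print(f"ELBO     = {elbo:.4f}")
print(f"gap      = {log_evidence - elbo:.4f}  (KL = {kl_gap:.4f})")
```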
Mutual Information
Mutual Information (MI) measures how much knowing one random variable reduces uncertainty about another.
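A small sketch (the joint table is made up for illustration): mutual information computed directly from a joint distribution as I(X; Y) = Σ p(x, y) log[p(x, y) / (p(x)p(y))], and checked against the equivalent identity H(X) + H(Y) − H(X, Y).

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Arbitrary 2x2 joint distribution p(x, y), for illustration only.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

# I(X; Y) = sum_xy p(x, y) * log2( p(x, y) / (p(x) p(y)) )
mi = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))

# Equivalent identity: I(X; Y) = H(X) + H(Y) - H(X, Y)
mi_check = entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())

print(f"I(X;Y) = {mi:.4f} bits (check: {mi_check:.4f})")
```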
KL Divergence (Kullback-Leibler Divergence)
Kullback-Leibler (KL) divergence measures how much one probability distribution P diverges from a reference distribution Q, i.e., how differently P allocates probability mass relative to Q.
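A brief sketch with made-up distributions, showing the discrete definition D_KL(P ‖ Q) = Σ p·log(p/q) and the fact that KL divergence is asymmetric.

```python
import numpy as np

def kl(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                      # terms with p_i = 0 contribute nothing
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Arbitrary example distributions (for illustration only).
p = [0.7, 0.2, 0.1]
q = [0.4, 0.4, 0.2]

print(f"KL(P || Q) = {kl(p, q):.4f} nats")
print(f"KL(Q || P) = {kl(q, p):.4f} nats   # asymmetric: generally != KL(P || Q)")
```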
Shannon Entropy
Shannon entropy quantifies the average uncertainty or information content of a random variable in bits when using base-2 logarithms.
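A quick sketch (the example probabilities are arbitrary): Shannon entropy H(X) = −Σ p·log2(p) in bits, which is maximal for a uniform distribution and drops as the outcome becomes more predictable.

```python
import numpy as np

def shannon_entropy(probs):
    """H(X) = -sum_i p_i * log2(p_i), in bits."""
    p = np.asarray(probs, float)
    p = p[p > 0]                      # 0 * log(0) is treated as 0
    return -np.sum(p * np.log2(p))

print(f"fair coin:   {shannon_entropy([0.5, 0.5]):.4f} bits")   # 1 bit
print(f"biased coin: {shannon_entropy([0.9, 0.1]):.4f} bits")   # ~0.469 bits
print(f"fair die:    {shannon_entropy([1/6] * 6):.4f} bits")    # log2(6) ~ 2.585 bits
```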
Information Theory
Information theory quantifies uncertainty and information using measures like entropy, cross-entropy, KL divergence, and mutual information.
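To tie the listed measures together, here is a small sketch (the distributions are illustrative) verifying the identity that cross-entropy decomposes as H(P, Q) = H(P) + D_KL(P ‖ Q).

```python
import numpy as np

# Illustrative distributions (arbitrary values).
p = np.array([0.6, 0.3, 0.1])
q = np.array([0.2, 0.5, 0.3])

entropy = -np.sum(p * np.log2(p))        # H(P)
cross_entropy = -np.sum(p * np.log2(q))  # H(P, Q)
kl = np.sum(p * np.log2(p / q))          # D_KL(P || Q)

# Identity linking the three measures: H(P, Q) = H(P) + D_KL(P || Q)
print(f"H(P)      = {entropy:.4f} bits")
print(f"H(P, Q)   = {cross_entropy:.4f} bits")
print(f"H(P) + KL = {entropy + kl:.4f} bits")
```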