Concepts (4)

Filtering by: #entropy
πŸ“š Theory Β· Advanced

Variational Inference Theory

Variational Inference (VI) replaces an intractable posterior with a simpler, tractable distribution and fits it by minimizing the KL divergence to the true posterior, which is equivalent to maximizing the evidence lower bound (ELBO).

#variational inference Β· #elbo Β· #kl divergence Β· +12
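
To make the ELBO concrete, here is a minimal Monte Carlo sketch for a toy conjugate model. The setup is an assumption for illustration: prior p(z) = N(0, 1), likelihood p(x | z) = N(z, 1), and a Gaussian q(z); the helper names `elbo` and `log_normal_pdf` are illustrative, not from any library.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_normal_pdf(x, mean, std):
    # log density of N(mean, std^2) evaluated at x
    return -0.5 * np.log(2 * np.pi * std**2) - (x - mean) ** 2 / (2 * std**2)

def elbo(mu, sigma, x_obs, n_samples=100_000):
    # ELBO = E_q[log p(x, z) - log q(z)], estimated with samples z ~ q
    z = rng.normal(mu, sigma, size=n_samples)
    log_joint = log_normal_pdf(z, 0.0, 1.0) + log_normal_pdf(x_obs, z, 1.0)
    log_q = log_normal_pdf(z, mu, sigma)
    return np.mean(log_joint - log_q)

x_obs = 2.0
# For this toy model the exact posterior is N(1.0, 0.5). The ELBO never
# exceeds log p(x); the gap is KL(q || posterior), so it closes when q
# matches the posterior.
print(elbo(0.0, 1.0, x_obs))           # mismatched q: noticeably lower ELBO
print(elbo(1.0, np.sqrt(0.5), x_obs))  # q = posterior: ELBO β‰ˆ log p(x) β‰ˆ -2.27
```
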
πŸ“š Theory Β· Intermediate

Mutual Information

Mutual Information (MI) measures how much knowing one random variable reduces uncertainty about another.

#mutual information Β· #entropy Β· #kl divergence Β· +12
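
For discrete variables, MI is the KL divergence between the joint distribution and the product of its marginals, which translates directly to a few lines of numpy. A minimal sketch; the function name `mutual_information` is illustrative.

```python
import numpy as np

def mutual_information(joint):
    # I(X; Y) = sum_{x,y} p(x,y) log( p(x,y) / (p(x) p(y)) ), in nats
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)  # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)  # marginal p(y)
    mask = joint > 0                       # zero-probability cells contribute 0
    return np.sum(joint[mask] * np.log(joint[mask] / (px * py)[mask]))

# Perfectly dependent variables: knowing X fully determines Y.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(mutual_information(joint))  # ln(2) β‰ˆ 0.693 nats = 1 bit
```
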
πŸ“š Theory Β· Intermediate

KL Divergence (Kullback-Leibler Divergence)

Kullback–Leibler (KL) divergence measures how one probability distribution P allocates probability mass differently from a reference distribution Q; it is non-negative and zero only when P = Q.

#kl divergence Β· #kullback-leibler Β· #cross-entropy Β· +12
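
The discrete definition KL(P || Q) = Ξ£ p(x) log(p(x)/q(x)) maps directly to code. A minimal numpy sketch; the name `kl_divergence` is illustrative, and terms with p(x) = 0 contribute zero by convention.

```python
import numpy as np

def kl_divergence(p, q):
    # KL(P || Q) = sum_x p(x) log( p(x) / q(x) ), in nats
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # convention: 0 * log(0 / q) = 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # β‰ˆ 0.025 nats
print(kl_divergence(q, p))  # different value: KL is asymmetric
```
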
πŸ“š Theory Β· Intermediate

Information Theory

Information theory quantifies uncertainty and information using measures like entropy, cross-entropy, KL divergence, and mutual information.

#entropy Β· #cross-entropy Β· #kl divergence Β· +12
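
For discrete distributions, the core measures fit in a few lines. A sketch (function names illustrative) that also checks the identity H(P, Q) = H(P) + KL(P || Q), using the same P and Q as the KL example above.

```python
import numpy as np

def entropy(p):
    # H(P) = -sum_x p(x) log p(x), in nats
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 log 0 = 0 by convention
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    # H(P, Q) = -sum_x p(x) log q(x) = H(P) + KL(P || Q)
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return -np.sum(p[mask] * np.log(q[mask]))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(entropy(p))           # β‰ˆ 1.030 nats
print(cross_entropy(p, q))  # β‰ˆ 1.055 nats; the gap is KL(P || Q) β‰ˆ 0.025
```
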