🎓 How I Study AI
Learn AI Papers & Lectures the Easy Way

Concepts (4)



📚 Theory · Advanced

Variational Dropout & Bayesian Deep Learning

Dropout can be interpreted as variational inference in a Bayesian neural network, where applying random masks approximates sampling from a posterior over weights.

#bayesian neural networks · #variational inference · #dropout · +12
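The mask-as-posterior-sample view is what Monte Carlo dropout exploits at prediction time: keep dropout active and average several stochastic forward passes, using their spread as an uncertainty estimate. A minimal NumPy sketch, assuming a hypothetical untrained two-layer network (the weights and the `mc_dropout_predict` helper are illustrative, not from any library):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical untrained 2-layer network, purely for illustration.
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 1))

def mc_dropout_predict(x, p=0.5, n_samples=100):
    """Monte Carlo dropout: keep dropout ON at test time. Each forward
    pass with a fresh Bernoulli mask acts like one sample from the
    approximate posterior over weights."""
    preds = []
    for _ in range(n_samples):
        h = np.maximum(x @ W1, 0.0)          # ReLU hidden layer
        mask = rng.random(h.shape) >= p      # fresh random dropout mask
        h = h * mask / (1.0 - p)             # inverted-dropout rescaling
        preds.append(h @ W2)
    preds = np.stack(preds)
    # Mean over samples = prediction; std = epistemic uncertainty estimate.
    return preds.mean(axis=0), preds.std(axis=0)

x = rng.normal(size=(1, 4))
mean, std = mc_dropout_predict(x)
```

In a trained network the same loop applies unchanged; only the weights differ.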
⚙️ Algorithm · Advanced

Stochastic Variational Inference

Stochastic Variational Inference (SVI) scales variational inference to large datasets by taking noisy but unbiased gradient steps using minibatches.

#stochastic variational inference
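A toy illustration of the unbiased-minibatch idea, assuming a hypothetical conjugate problem (inferring a Gaussian mean; all names and constants here are made up for the sketch): each step draws a minibatch, rescales its log-likelihood gradient by N / batch so its expectation equals the full-data gradient, and ascends a single-sample reparameterized ELBO gradient:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative): infer the mean mu of a Gaussian.
# Data: N draws from N(3, 1). Prior: mu ~ N(0, 10^2).
N = 10_000
data = rng.normal(3.0, 1.0, size=N)

m, log_s = 0.0, 0.0          # params of the variational family q(mu) = N(m, s^2)
lr, batch = 1e-5, 100

for _ in range(3000):
    xb = rng.choice(data, size=batch, replace=False)
    eps = rng.normal()
    s = np.exp(log_s)
    mu = m + s * eps                         # reparameterized sample from q
    # Noisy but unbiased ELBO gradient: the minibatch log-likelihood term
    # is rescaled by N / batch so its expectation matches the full-data sum.
    d_mu = (N / batch) * np.sum(xb - mu) - mu / 100.0
    m += lr * d_mu                           # ascend the ELBO in m
    log_s += lr * (d_mu * s * eps + 1.0)     # chain rule + entropy gradient of q
```

After a few thousand noisy steps `m` settles near the true mean (3.0) while `s` contracts toward the narrow posterior width, despite never touching more than 100 points per step.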
Filtering by: Group: Variational Methods · #reparameterization trick · #elbo · #variational inference · +12
∑ Math · Advanced

Evidence Lower Bound (ELBO)

The Evidence Lower Bound (ELBO) is a tractable lower bound on the log evidence log p(x) used to perform approximate Bayesian inference.

#elbo · #variational inference · #vae · +12
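For a toy conjugate model (prior mu ~ N(0, 1), likelihood x ~ N(mu, 1), chosen here purely for illustration) both the ELBO and the exact log evidence are available in closed form, which makes the bound easy to check numerically; the `elbo` and `log_evidence` helpers below are hypothetical names for this sketch:

```python
import math

# Toy conjugate model (illustrative): prior mu ~ N(0, 1), likelihood
# x ~ N(mu, 1), variational family q(mu) = N(m, s^2).

def elbo(x, m, s):
    """Closed-form ELBO = E_q[log p(x|mu)] + E_q[log p(mu)] + H[q]."""
    return (-0.5 * math.log(2 * math.pi)
            - 0.5 * ((x - m) ** 2 + m ** 2 + 2 * s ** 2)
            + 0.5 + math.log(s))

def log_evidence(x):
    """Exact log p(x): marginalizing mu out gives x ~ N(0, 2)."""
    return -0.5 * math.log(4 * math.pi) - x ** 2 / 4

x = 1.7
# The gap log p(x) - ELBO equals KL(q || posterior), so it is >= 0 ...
gap = log_evidence(x) - elbo(x, 0.0, 1.0)
# ... and shrinks to zero when q equals the true posterior N(x/2, 1/2).
tight = log_evidence(x) - elbo(x, x / 2, math.sqrt(0.5))
```

The slack in the bound is exactly KL(q(z) || p(z|x)), which is why maximizing the ELBO over q doubles as approximate posterior inference.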
📚 Theory · Intermediate

Variational Inference

Variational Inference (VI) turns Bayesian inference into an optimization problem by choosing a simple family q(z) to approximate an intractable posterior p(z|x).

#variational inference · #elbo · #kl divergence · +12
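The "inference as optimization" view can be made concrete on a toy conjugate example where the true posterior is known (prior mu ~ N(0, 1), likelihood x ~ N(mu, 1), posterior N(x/2, 1/2); the model and the `elbo` helper are illustrative assumptions): a brute-force search over a Gaussian family for the highest ELBO lands on the exact posterior parameters.

```python
import math
import numpy as np

# Toy conjugate model (illustrative): prior mu ~ N(0, 1), likelihood
# x ~ N(mu, 1); the exact posterior is N(x/2, 1/2).

def elbo(x, m, s):
    # Closed-form ELBO for q(mu) = N(m, s^2) in this model.
    return (-0.5 * math.log(2 * math.pi)
            - 0.5 * ((x - m) ** 2 + m ** 2 + 2 * s ** 2)
            + 0.5 + math.log(s))

x = 1.7
# VI as optimization: pick the member of the simple family q that
# maximizes the ELBO, i.e. minimizes KL(q(mu) || p(mu|x)).
grid_m = np.linspace(-2.0, 2.0, 401)      # candidate means
grid_s = np.linspace(0.05, 2.0, 400)      # candidate std devs
score, m_star, s_star = max(
    (elbo(x, m, s), m, s) for m in grid_m for s in grid_s
)
# Best fit: m_star near x/2 = 0.85, s_star near sqrt(1/2) ~ 0.707.
```

Real VI replaces the grid with gradient ascent on the ELBO, but the objective and the answer are the same.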