How I Study AI - Learn AI Papers & Lectures the Easy Way

Concepts (6)

Groups

📐 Linear Algebra (15) · 📈 Calculus & Differentiation (10) · 🎯 Optimization (14) · 🎲 Probability Theory (12) · 📊 Statistics for ML (9) · 📡 Information Theory (10) · 🔺 Convex Optimization (7) · 🔢 Numerical Methods (6) · 🕸 Graph Theory for Deep Learning (6) · 🔵 Topology for ML (5) · 🌐 Differential Geometry (6) · ∞ Measure Theory & Functional Analysis (6) · 🎰 Random Matrix Theory (5) · 🌊 Fourier Analysis & Signal Processing (9) · 🎰 Sampling & Monte Carlo Methods (10) · 🧠 Deep Learning Theory (12) · 🛡️ Regularization Theory (11) · 👁️ Attention & Transformer Theory (10) · 🎨 Generative Model Theory (11) · 🔮 Representation Learning (10) · 🎮 Reinforcement Learning Mathematics (9) · 🔄 Variational Methods (8) · 📉 Loss Functions & Objectives (10) · ⏱️ Sequence & Temporal Models (8) · 💎 Geometric Deep Learning (8)


Normalizing Flow Variational Inference

Normalizing-flow variational inference enriches the variational family by transforming a simple base distribution through a sequence of invertible, differentiable mappings.

#normalizing flows · #variational inference · #elbo · +12
∑ Math · Advanced
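As a minimal sketch of the transform-and-track-Jacobian idea, a single planar flow f(z) = z + u·tanh(wᵀz + b) can push standard-normal base samples through an invertible map while recording the log-determinant needed for the change of variables (the names u, w, b and their values are illustrative, not from the card):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 2
u = np.array([1.0, 0.5])    # illustrative flow parameters
w = np.array([0.5, -1.0])   # (w @ u > -1 keeps the planar flow invertible)
b = 0.0

def planar_flow(z):
    """Apply f(z) = z + u * tanh(w.z + b); return f(z) and log|det Jacobian|."""
    a = np.tanh(z @ w + b)                 # scalar activation per sample
    f = z + np.outer(a, u)
    psi = (1 - a**2)[:, None] * w          # h'(w.z + b) * w
    logdet = np.log(np.abs(1 + psi @ u))   # |det J| = |1 + u . psi|
    return f, logdet

z = rng.standard_normal((1000, d))         # samples from the base N(0, I)
x, logdet = planar_flow(z)
# change of variables: log q(x) = log N(z; 0, I) - log|det J|
log_base = -0.5 * (z**2).sum(axis=1) - 0.5 * d * np.log(2 * np.pi)
log_q = log_base - logdet
```

Stacking several such layers is what "a sequence of invertible, differentiable mappings" means in practice: the log-determinants simply add up.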

Evidence Lower Bound (ELBO)

The Evidence Lower Bound (ELBO) is a tractable lower bound on the log evidence log p(x) used to perform approximate Bayesian inference.

#elbo · #monte carlo · #variational inference · #vae · +12
∑ Math · Advanced
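The "lower bound" claim can be checked numerically on a toy conjugate model (the model choice p(z) = N(0,1), p(x|z) = N(z,1) with x = 1 observed is an assumption for illustration, since there the exact evidence log p(x) = log N(x; 0, 2) is available in closed form):

```python
import numpy as np

rng = np.random.default_rng(0)
x = 1.0  # single observation

def log_normal(v, mean, var):
    return -0.5 * np.log(2 * np.pi * var) - (v - mean) ** 2 / (2 * var)

def elbo(mu, var, n=200_000):
    """Monte Carlo estimate of E_q[log p(x, z) - log q(z)] for q = N(mu, var)."""
    z = mu + np.sqrt(var) * rng.standard_normal(n)
    log_joint = log_normal(z, 0.0, 1.0) + log_normal(x, z, 1.0)
    log_q = log_normal(z, mu, var)
    return (log_joint - log_q).mean()

log_evidence = log_normal(x, 0.0, 2.0)       # exact log p(x)
gap_crude = log_evidence - elbo(0.0, 1.0)    # mismatched q: gap = KL(q || p(z|x)) > 0
gap_exact = log_evidence - elbo(0.5, 0.5)    # q = exact posterior: gap vanishes
```

The gap between log p(x) and the ELBO is exactly KL(q(z) ‖ p(z|x)), so it closes to zero when q matches the true posterior N(0.5, 0.5).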

Stochastic Differential Equations for Generation

A forward stochastic differential equation (SDE) models a state that drifts deterministically and is shaken by random Brownian noise over time.

#stochastic differential equation · #diffusion model · #euler maruyama · +12
📚 Theory · Advanced
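The drift-plus-Brownian-noise picture can be simulated with the Euler–Maruyama scheme; as an illustrative (not card-specified) choice, the variance-preserving forward SDE dx = −½βx dt + √β dW drives any initial data toward a standard normal:

```python
import numpy as np

rng = np.random.default_rng(0)
beta, T, n_steps, n_paths = 1.0, 10.0, 1000, 20_000
dt = T / n_steps

x = 3.0 * np.ones(n_paths)          # "data": a point mass at 3
for _ in range(n_steps):
    drift = -0.5 * beta * x                                   # deterministic pull to 0
    noise = np.sqrt(beta * dt) * rng.standard_normal(n_paths)  # Brownian increment
    x = x + drift * dt + noise                                # Euler-Maruyama step

# after long time the marginal is close to N(0, 1): mean -> 0, variance -> 1
```

This is the "forward" half of a diffusion model; generation runs a corresponding reverse-time SDE back from the noise.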

Variational Autoencoders (VAE) Theory

A Variational Autoencoder (VAE) is a probabilistic autoencoder that learns to generate data by inferring hidden causes (latent variables) and decoding them back to observations.

#variational autoencoder · #elbo · #kl divergence · +12
📚 Theory · Advanced
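A minimal NumPy sketch of the two ingredients the card names, using a deliberately tiny linear "encoder" (the weights W_mu, W_lv and dimensions are illustrative assumptions): infer a latent via the reparameterization z = μ + σ·ε, and evaluate the analytic KL term of the ELBO against the standard-normal prior:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4)                    # one observation, dim 4
W_mu = rng.standard_normal((2, 4))            # toy linear encoder weights
W_lv = rng.standard_normal((2, 4))

mu = W_mu @ x                                 # encoder mean of q(z|x)
logvar = W_lv @ x                             # encoder log-variance of q(z|x)
eps = rng.standard_normal(2)
z = mu + np.exp(0.5 * logvar) * eps           # reparameterized latent sample

# analytic KL( N(mu, diag(sigma^2)) || N(0, I) ), summed over latent dims
kl = 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)
```

A decoder would map z back to observation space; training maximizes reconstruction likelihood minus this KL, i.e. the ELBO.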

Generalization Bounds for Deep Learning

Generalization bounds explain why deep neural networks can perform well on unseen data despite having many parameters.

#generalization bounds · #pac-bayes · #compression bounds · +12
∑ Math · Advanced
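A feel for why compression matters comes from the classical finite-class (Occam) bound, a much simpler relative of the PAC-Bayes and compression bounds the card lists: with probability ≥ 1 − δ, every h in a finite class H satisfies test_err(h) ≤ train_err(h) + √((ln|H| + ln(1/δ)) / (2m)). The bit counts below are illustrative assumptions:

```python
import numpy as np

def occam_gap(bits, m, delta=0.05):
    """Generalization gap for |H| = 2**bits hypotheses and m i.i.d. examples."""
    log_H = bits * np.log(2.0)
    return np.sqrt((log_H + np.log(1.0 / delta)) / (2.0 * m))

# describing a network in fewer bits shrinks |H| and tightens the bound
gap_big = occam_gap(bits=1_000_000, m=50_000)   # million-bit description: vacuous
gap_small = occam_gap(bits=10_000, m=50_000)    # compressed to 10k bits: meaningful
```

The uncompressed bound exceeds 1 (vacuous for 0-1 error), while the compressed one is a nontrivial ~0.26: this is the intuition behind compression-based bounds for deep nets.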

Sigma-Algebras & Measure Spaces

A σ-algebra is a collection of subsets that is closed under complements and countable unions, giving us a stable universe of sets where measure makes sense.

#sigma-algebra · #measure space · #measurable sets · +12
∑ Math · Advanced
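In the finite case the defining closure properties can be checked directly (countable unions reduce to finite ones); this illustrative helper tests whether a family of subsets of Ω contains Ω and is closed under complements and unions:

```python
def is_sigma_algebra(omega, family):
    """Check the sigma-algebra axioms for a family of subsets of a finite omega."""
    fam = {frozenset(s) for s in family}
    full = frozenset(omega)
    if full not in fam:
        return False
    for a in fam:
        if full - a not in fam:          # closed under complement
            return False
    for a in fam:
        for b in fam:
            if a | b not in fam:         # closed under (finite) union
                return False
    return True

omega = {1, 2, 3, 4}
trivial = [set(), omega]                      # smallest sigma-algebra on omega
generated = [set(), {1, 2}, {3, 4}, omega]    # sigma-algebra generated by {1, 2}
not_closed = [set(), {1}, omega]              # missing the complement {2, 3, 4}
```

Only the first two families qualify; the third fails complement-closure, which is exactly the kind of "instability" the σ-algebra axioms rule out.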