How I Study AI - Learn AI Papers & Lectures the Easy Way

Concepts (25)

Groups

📐 Linear Algebra (15)
📈 Calculus & Differentiation (10)
🎯 Optimization (14)
🎲 Probability Theory (12)
📊 Statistics for ML (9)
📡 Information Theory (10)
🔺 Convex Optimization (7)
🔢 Numerical Methods (6)
🕸 Graph Theory for Deep Learning (6)
🔵 Topology for ML (5)
🌐 Differential Geometry (6)
∞ Measure Theory & Functional Analysis (6)
🎰 Random Matrix Theory (5)
🌊 Fourier Analysis & Signal Processing (9)
🎰 Sampling & Monte Carlo Methods (10)
🧠 Deep Learning Theory (12)
🛡️ Regularization Theory (11)
👁️ Attention & Transformer Theory (10)
🎨 Generative Model Theory (11)
🔮 Representation Learning (10)
🎮 Reinforcement Learning Mathematics (9)
🔄 Variational Methods (8)
📉 Loss Functions & Objectives (10)
⏱️ Sequence & Temporal Models (8)
💎 Geometric Deep Learning (8)

∑Math · Intermediate

Law of Large Numbers

The Weak Law of Large Numbers (WLLN) says that the sample average of independent, identically distributed (i.i.d.) random variables with finite mean gets close to the true mean with high probability as the sample size grows.

#law of large numbers · #weak law · #sample mean · +12
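A minimal simulation of the WLLN, assuming i.i.d. Uniform(0, 1) draws (true mean 0.5); the function name is illustrative, not from the card:

```python
import random

def sample_mean(n, seed=0):
    """Average of n i.i.d. Uniform(0, 1) draws; the true mean is 0.5."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n)) / n

# As n grows, the sample mean concentrates around 0.5 (WLLN).
for n in (10, 1_000, 100_000):
    print(n, sample_mean(n))
```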
📚Theory · Advanced

Normalizing Flow Variational Inference

Normalizing-flow variational inference enriches the variational family by transforming a simple base distribution through a sequence of invertible, differentiable mappings.

#monte carlo · #normalizing flows · #variational inference · #elbo · +12
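A sketch of the change-of-variables rule behind flows, assuming a single affine layer x = scale · z + shift over a standard normal base (names are illustrative):

```python
import math

def base_logpdf(z):
    """Log-density of the standard normal base distribution."""
    return -0.5 * (z * z + math.log(2.0 * math.pi))

def flow_logpdf(x, scale=2.0, shift=1.0):
    """Log-density after one invertible affine map x = scale * z + shift.

    Change of variables: log q(x) = log p(z) - log|dx/dz|, with z = (x - shift) / scale.
    """
    z = (x - shift) / scale
    return base_logpdf(z) - math.log(abs(scale))
```

Stacking several such invertible layers, each contributing its own log-determinant term, gives a richer variational family while keeping the density tractable.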
∑Math · Advanced

Evidence Lower Bound (ELBO)

The Evidence Lower Bound (ELBO) is a tractable lower bound on the log evidence log p(x) used to perform approximate Bayesian inference.

#elbo · #variational inference · #vae · +12
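The bound in the card follows from a standard identity (writing q(z) for the variational distribution):

```latex
\log p(x)
= \underbrace{\mathbb{E}_{q(z)}\big[\log p(x, z) - \log q(z)\big]}_{\mathrm{ELBO}(q)}
+ \mathrm{KL}\big(q(z)\,\|\,p(z \mid x)\big)
\;\ge\; \mathrm{ELBO}(q).
```

Since the KL term is nonnegative, maximizing the ELBO simultaneously tightens the bound and pushes q toward the true posterior.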
∑Math · Intermediate

Discount Factor & Return

The discounted return G_t sums all future rewards but down-weights distant rewards by powers of a discount factor γ.

#discount factor · #discounted return · #reinforcement learning · +12
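A minimal sketch of the discounted return, computed backward so each reward is discounted once per step of delay (function name is illustrative):

```python
def discounted_return(rewards, gamma=0.99):
    """G_t = r_t + gamma * r_{t+1} + gamma^2 * r_{t+2} + ..."""
    g = 0.0
    for r in reversed(rewards):  # backward pass: g accumulates gamma-weighted future rewards
        g = r + gamma * g
    return g

# With gamma = 0.5: 1 + 0.5 * 1 + 0.25 * 1 = 1.75
print(discounted_return([1.0, 1.0, 1.0], gamma=0.5))
```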
∑Math · Advanced

Stochastic Differential Equations for Generation

A forward stochastic differential equation (SDE) models a state that drifts deterministically and is shaken by random Brownian noise over time.

#stochastic differential equation · #diffusion model · #euler maruyama · +12
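A one-path Euler–Maruyama simulation of such an SDE; the mean-reverting drift and constant diffusion below are illustrative choices, not the card's specific model:

```python
import math
import random

def euler_maruyama(x0, drift, diffusion, t_end=1.0, n_steps=500, seed=0):
    """Simulate dX = drift(x, t) dt + diffusion(x, t) dW along one path."""
    rng = random.Random(seed)
    dt = t_end / n_steps
    x, t = x0, 0.0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        x += drift(x, t) * dt + diffusion(x, t) * dw
        t += dt
    return x

# A mean-reverting forward SDE of the kind used in diffusion models:
x_T = euler_maruyama(1.0, drift=lambda x, t: -x, diffusion=lambda x, t: 0.5)
```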
📚Theory · Advanced

Variational Autoencoders (VAE) Theory

A Variational Autoencoder (VAE) is a probabilistic autoencoder that learns to generate data by inferring hidden causes (latent variables) and decoding them back to observations.

#variational autoencoder · #elbo · #kl divergence · +12
📚Theory · Advanced

Generalization Bounds for Deep Learning

Generalization bounds aim to explain why deep neural networks can perform well on unseen data despite having many parameters.

#generalization bounds · #pac-bayes · #compression bounds · +12
⚙️Algorithm · Intermediate

Stratified & Latin Hypercube Sampling

Stratified sampling reduces Monte Carlo variance by dividing the domain into non-overlapping regions (strata) and sampling within each region; Latin hypercube sampling extends the idea to higher dimensions by stratifying each coordinate separately.

#stratified sampling · #latin hypercube sampling · #variance reduction · +11
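A minimal sketch of one-dimensional stratified sampling over [0, 1), assuming equal-width strata (names are illustrative):

```python
import random

def stratified_mean(f, n_strata=10, per_stratum=10, seed=0):
    """Estimate E[f(U)], U ~ Uniform(0, 1), drawing per_stratum points in each stratum."""
    rng = random.Random(seed)
    width = 1.0 / n_strata
    total = 0.0
    for i in range(n_strata):
        lo = i * width  # stratum [lo, lo + width)
        total += sum(f(lo + width * rng.random()) for _ in range(per_stratum))
    return total / (n_strata * per_stratum)

# Same sample budget as plain Monte Carlo, but lower variance for smooth f.
```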
⚙️Algorithm · Intermediate

Rejection Sampling

Rejection sampling draws from a hard target distribution by using an easier proposal and accepting with probability p(x)/(M q(x)).

#rejection sampling · #accept-reject · #proposal distribution · +11
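A sketch of accept–reject sampling, assuming the target p(x) = 6x(1 − x) (a Beta(2, 2) density) and a Uniform(0, 1) proposal; both choices are illustrative:

```python
import random

def rejection_sample(n, seed=0):
    """Sample from p(x) = 6 x (1 - x) on [0, 1] via a Uniform(0, 1) proposal.

    M = 1.5 guarantees p(x) <= M * q(x) everywhere, since max p = 1.5 and q = 1.
    """
    rng = random.Random(seed)
    M = 1.5
    out = []
    while len(out) < n:
        x = rng.random()                         # propose x ~ q
        if rng.random() <= 6 * x * (1 - x) / M:  # accept with probability p(x) / (M q(x))
            out.append(x)
    return out
```

The expected acceptance rate is 1/M, so a tighter envelope constant M wastes fewer proposals.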
⚙️Algorithm · Intermediate

Importance Sampling

Importance sampling rewrites an expectation under a hard-to-sample distribution p as an expectation under an easier distribution q, multiplied by a weight w = p/q.

#importance sampling · #proposal distribution · #self-normalized · +12
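A sketch of the reweighting trick on a concrete rare-event problem: estimating P(X > 4) for X ~ N(0, 1) by sampling from the shifted proposal q = N(4, 1), where the weight works out to w(x) = p(x)/q(x) = exp(8 − 4x). The example problem is an assumption, not from the card:

```python
import math
import random

def importance_tail_prob(n=50_000, seed=0):
    """Estimate P(X > 4), X ~ N(0, 1), using the proposal q = N(4, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(4.0, 1.0)               # easy-to-sample proposal draw
        if x > 4.0:
            total += math.exp(8.0 - 4.0 * x)  # importance weight p(x) / q(x)
    return total / n
```

Plain Monte Carlo would almost never land in the tail, while the shifted proposal puts half its mass there and the weights correct the bias.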
⚙️Algorithm · Intermediate

Monte Carlo Estimation

Monte Carlo estimation approximates an expected value by averaging function values at random samples drawn from a probability distribution.

#monte carlo · #expectation · #variance reduction · +12
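A minimal sketch of the plain Monte Carlo estimator, assuming Uniform(0, 1) samples (names are illustrative):

```python
import random

def mc_expectation(f, n=100_000, seed=0):
    """Approximate E[f(U)], U ~ Uniform(0, 1), by averaging f at n random draws."""
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n)) / n

# Example: E[U^2] = 1/3; the error shrinks like O(1 / sqrt(n)).
```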
∑Math · Advanced

Sigma-Algebras & Measure Spaces

A σ-algebra is a collection of subsets that is closed under complements and countable unions, giving us a stable universe of sets where measure makes sense.

#sigma-algebra · #measure space · #measurable sets · +12
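The closure axioms from the card, written compactly:

```latex
\mathcal{F} \subseteq 2^{\Omega} \text{ is a } \sigma\text{-algebra iff:}\quad
\Omega \in \mathcal{F}, \qquad
A \in \mathcal{F} \Rightarrow A^{c} \in \mathcal{F}, \qquad
A_1, A_2, \ldots \in \mathcal{F} \Rightarrow \bigcup_{n=1}^{\infty} A_n \in \mathcal{F}.
```

Closure under complements and countable unions also gives closure under countable intersections via De Morgan's laws, so a σ-algebra is stable under all the set operations a measure needs.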