How I Study AI - Learn AI Papers & Lectures the Easy Way

Concepts (10)

Groups

๐Ÿ“Linear Algebra15๐Ÿ“ˆCalculus & Differentiation10๐ŸŽฏOptimization14๐ŸŽฒProbability Theory12๐Ÿ“ŠStatistics for ML9๐Ÿ“กInformation Theory10๐Ÿ”บConvex Optimization7๐Ÿ”ขNumerical Methods6๐Ÿ•ธGraph Theory for Deep Learning6๐Ÿ”ตTopology for ML5๐ŸŒDifferential Geometry6โˆžMeasure Theory & Functional Analysis6๐ŸŽฐRandom Matrix Theory5๐ŸŒŠFourier Analysis & Signal Processing9๐ŸŽฐSampling & Monte Carlo Methods10๐Ÿง Deep Learning Theory12๐Ÿ›ก๏ธRegularization Theory11๐Ÿ‘๏ธAttention & Transformer Theory10๐ŸŽจGenerative Model Theory11๐Ÿ”ฎRepresentation Learning10๐ŸŽฎReinforcement Learning Mathematics9๐Ÿ”„Variational Methods8๐Ÿ“‰Loss Functions & Objectives10โฑ๏ธSequence & Temporal Models8๐Ÿ’ŽGeometric Deep Learning8

📚 Theory · Advanced

Normalizing Flow Variational Inference

Normalizing-flow variational inference enriches the variational family by transforming a simple base distribution through a sequence of invertible, differentiable mappings.

#normalizing flows #variational inference #elbo +12
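A minimal numpy sketch of the change-of-variables bookkeeping behind this card; the 1-D standard-normal base and the two hand-picked invertible layers are illustrative assumptions, not a prescribed flow architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def base_logpdf(z):
    # log-density of the standard-normal base distribution q0
    return -0.5 * z**2 - 0.5 * np.log(2 * np.pi)

# Two invertible, differentiable maps: an affine layer, then the
# strictly increasing nonlinearity z -> z + tanh(z).
a, b = 1.5, -0.3

z0 = rng.standard_normal(10_000)

z1 = a * z0 + b                        # layer 1: affine
logdet1 = np.log(abs(a)) * np.ones_like(z0)

z2 = z1 + np.tanh(z1)                  # layer 2: monotone, hence invertible
logdet2 = np.log(1 + 1 / np.cosh(z1)**2)

# Change of variables: log qK(zK) = log q0(z0) - sum_k log|det Jk|
log_qK = base_logpdf(z0) - logdet1 - logdet2

print("mean log-density of flow samples:", log_qK.mean())
```

In variational inference, log qK(zK) computed this way enters the ELBO as the entropy term, and the flow parameters (here a and b) would be learned rather than fixed.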
📚 Theory · Intermediate

Maximum Likelihood & Generative Models

Maximum Likelihood Estimation (MLE) picks parameters that make the observed data most probable under a chosen probabilistic model.

#maximum likelihood #bayesian inference #generative models #naive bayes +12
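A quick sketch of MLE in numpy, assuming an i.i.d. Gaussian model with known variance; the dataset and grid search are illustrative, and the closed form is just the sample mean:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=1.0, size=500)  # observed data

def log_likelihood(mu, x, sigma=1.0):
    # log p(x | mu) for an i.i.d. Gaussian model with known sigma
    return np.sum(-0.5 * ((x - mu) / sigma) ** 2
                  - np.log(sigma * np.sqrt(2 * np.pi)))

# MLE over a grid of candidate means: pick the mu that makes the
# observed data most probable.
grid = np.linspace(0, 4, 401)
lls = np.array([log_likelihood(m, data) for m in grid])
mu_mle = grid[np.argmax(lls)]

print("grid MLE:", mu_mle, "closed form (sample mean):", data.mean())
```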
โš™๏ธAlgorithmAdvanced

Hamiltonian Monte Carlo (HMC)

Hamiltonian Monte Carlo (HMC) uses gradients of the log-density to propose long-distance moves that still land in high-probability regions.

#hamiltonian monte carlo #hmc #mcmc +11
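A minimal HMC step in numpy, assuming a standard-normal target so the gradient is trivial; the step size and path length are illustrative choices, not tuned values:

```python
import numpy as np

rng = np.random.default_rng(2)

def logp(x):          # log-density of the target (standard normal)
    return -0.5 * x**2

def grad_logp(x):     # its gradient, used to steer proposals
    return -x

def hmc_step(x, eps=0.2, L=20):
    p = rng.standard_normal()                  # resample momentum
    x_new, p_new = x, p
    p_new += 0.5 * eps * grad_logp(x_new)      # leapfrog: half-step momentum
    for i in range(L):
        x_new += eps * p_new                   # full-step position
        if i < L - 1:
            p_new += eps * grad_logp(x_new)    # full-step momentum
    p_new += 0.5 * eps * grad_logp(x_new)      # final half-step momentum
    # Metropolis correction with Hamiltonian H = -logp(x) + p^2 / 2
    h_old = -logp(x) + 0.5 * p**2
    h_new = -logp(x_new) + 0.5 * p_new**2
    return x_new if np.log(rng.random()) < h_old - h_new else x

x, samples = 0.0, []
for _ in range(5_000):
    x = hmc_step(x)
    samples.append(x)
print("sample mean/std:", np.mean(samples), np.std(samples))  # ~0 and ~1
```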
โš™๏ธAlgorithmIntermediate

Gibbs Sampling

Gibbs sampling is an MCMC method that generates samples by repeatedly drawing each variable from its conditional distribution given the others.

#gibbs sampling #mcmc #markov chain +12
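A sketch for a bivariate Gaussian with correlation rho, the usual textbook case where both conditionals are available in closed form (the value rho = 0.8 is an arbitrary illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
rho = 0.8            # correlation of the target bivariate normal

x, y, samples = 0.0, 0.0, []
for _ in range(10_000):
    # Draw each coordinate from its exact conditional given the other:
    # x | y ~ N(rho * y, 1 - rho^2), and symmetrically for y | x.
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))
    samples.append((x, y))

samples = np.array(samples)
print("empirical correlation:", np.corrcoef(samples.T)[0, 1])  # ~0.8
```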
โš™๏ธAlgorithmIntermediate

Metropolis-Hastings Algorithm

Metropolis-Hastings is a clever accept/reject method that lets you sample from complex probability distributions using only an unnormalized density.

#metropolis-hastings #mcmc #acceptance ratio +12
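A random-walk Metropolis sketch in numpy against an unnormalized Laplace target; the point to notice is that the unknown normalizing constant cancels in the acceptance ratio (target and proposal scale are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)

def log_unnorm(x):
    # Unnormalized log-density of a Laplace target; no constant needed
    return -abs(x)

x, samples = 0.0, []
for _ in range(20_000):
    prop = x + rng.normal(scale=1.0)   # symmetric random-walk proposal
    # Accept with probability min(1, f(prop) / f(x)); the normalizing
    # constant of f cancels in this ratio.
    if np.log(rng.random()) < log_unnorm(prop) - log_unnorm(x):
        x = prop
    samples.append(x)

print("sample std:", np.std(samples))  # Laplace(1) has std sqrt(2) ~ 1.41
```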
โš™๏ธAlgorithmIntermediate

Markov Chain Monte Carlo (MCMC)

MCMC builds a random walk (a Markov chain) whose long-run visiting frequency matches your target distribution, even when the target is only known up to a constant.

#mcmc #metropolis-hastings #gibbs sampling +12
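The "long-run visiting frequency" claim can be checked directly on a small discrete chain; the 3-state transition matrix below is an arbitrary illustrative example:

```python
import numpy as np

rng = np.random.default_rng(5)

# A 3-state Markov chain; row i of T holds transition probabilities from i.
T = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# Stationary distribution: the left eigenvector of T with eigenvalue 1.
vals, vecs = np.linalg.eig(T.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()

# Simulate the chain and measure long-run visiting frequencies.
state, counts = 0, np.zeros(3)
for _ in range(200_000):
    state = rng.choice(3, p=T[state])
    counts[state] += 1

print("stationary:", pi.round(3))
print("visit freq:", (counts / counts.sum()).round(3))  # should match pi
```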
โš™๏ธAlgorithmIntermediate

Importance Sampling

Importance sampling rewrites an expectation under a hard-to-sample distribution p as an expectation under an easier distribution q, multiplied by a weight w = p/q.

#importance sampling #proposal distribution #self-normalized +12
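A self-normalized importance-sampling sketch in numpy; treating N(3, 1) as the "hard" target p and N(0, 3) as the proposal q is purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)

def logpdf_normal(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

# Draw from the easy proposal q = N(0, 3), which covers the target p = N(3, 1).
xs = rng.normal(0.0, 3.0, size=100_000)

# Importance weights w = p / q, computed in log space for stability.
log_w = logpdf_normal(xs, 3, 1) - logpdf_normal(xs, 0, 3)
w = np.exp(log_w)

# Self-normalized estimate of E_p[x]: weighted average of the q-samples.
est = np.sum(w * xs) / np.sum(w)
print("estimate of E_p[x]:", est)  # should be close to 3
```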
📚 Theory · Intermediate

Bayesian Inference

Bayesian inference updates prior beliefs with observed data to produce a posterior distribution P(θ | D).

#bayesian inference #posterior #prior +12
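The Beta-Bernoulli coin is the standard closed-form illustration of this update; the Beta(2, 2) prior and true bias 0.7 below are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Prior belief about a coin's heads probability: Beta(2, 2), centered at 0.5.
alpha, beta = 2.0, 2.0

data = rng.random(100) < 0.7           # 100 flips of a coin with true p = 0.7
heads, tails = data.sum(), (~data).sum()

# Conjugate update: the posterior is Beta(alpha + heads, beta + tails).
alpha_post, beta_post = alpha + heads, beta + tails
post_mean = alpha_post / (alpha_post + beta_post)

print("posterior mean:", post_mean)    # pulled from 0.5 toward ~0.7 by the data
```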
∑ Math · Intermediate

Maximum A Posteriori (MAP) Estimation

Maximum A Posteriori (MAP) estimation chooses the parameter value with the highest posterior probability after seeing data.

#map estimation #posterior mode #bayesian inference +12
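A sketch contrasting a grid-search MAP estimate with its conjugate closed form, assuming a Gaussian likelihood with known unit variance and a standard-normal prior on the mean (all constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(8)
data = rng.normal(2.0, 1.0, size=10)   # few observations, so the prior matters

def log_posterior(mu, x, prior_mu=0.0, prior_sigma=1.0):
    # log p(mu | x) up to a constant: log-likelihood + log-prior
    ll = np.sum(-0.5 * (x - mu) ** 2)                    # Gaussian, sigma = 1
    lp = -0.5 * ((mu - prior_mu) / prior_sigma) ** 2     # Gaussian prior
    return ll + lp

# MAP = the mu with the highest posterior probability.
grid = np.linspace(-1, 4, 1001)
mu_map = grid[np.argmax([log_posterior(m, data) for m in grid])]

# Closed form for this conjugate case: the MLE shrunk toward the prior mean 0.
n = len(data)
closed = (n * data.mean()) / (n + 1)
print("grid MAP:", mu_map, "closed form:", closed)
```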
📚 Theory · Advanced

MCMC Theory

MCMC simulates a Markov chain whose long-run behavior matches a target distribution, letting us sample from complex posteriors without knowing the normalization constant.

#mcmc #metropolis-hastings #gibbs sampling +11
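The key theoretical property is that detailed balance with respect to the target implies the target is stationary; for a discrete Metropolis kernel this can be verified numerically (the 4-state unnormalized target below is an arbitrary example):

```python
import numpy as np

# Target over 4 states, known only up to a constant.
f = np.array([1.0, 3.0, 2.0, 4.0])   # unnormalized weights
pi = f / f.sum()                     # true target, used only for checking

# Metropolis kernel with a uniform proposal over all states.
n = len(f)
T = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            T[i, j] = (1 / n) * min(1.0, f[j] / f[i])   # propose j, accept
    T[i, i] = 1.0 - T[i].sum()                          # rejection mass

# Detailed balance pi_i T_ij = pi_j T_ji makes the flow matrix symmetric,
# which in turn implies stationarity: pi T = pi.
flow = pi[:, None] * T
print("detailed balance holds:", np.allclose(flow, flow.T))
print("pi is stationary:", np.allclose(pi @ T, pi))
```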