Random sampling, MCMC, and importance sampling techniques essential for Bayesian inference, generative models, and RL.
10 concepts
Monte Carlo estimation approximates an expected value by averaging function values at random samples drawn from a probability distribution.
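A minimal pure-Python sketch of this idea; the helper name, the Uniform(0, 1) sampler, and the test function are illustrative choices, not part of the original definition:

```python
import random

def monte_carlo_mean(f, sampler, n=100_000):
    """Estimate E[f(X)] by averaging f over n random draws from sampler."""
    return sum(f(sampler()) for _ in range(n)) / n

# Illustrative check: E[X^2] = 1/3 for X ~ Uniform(0, 1).
random.seed(0)
est = monte_carlo_mean(lambda x: x * x, random.random)
```

The estimator's error shrinks as O(1/sqrt(n)) regardless of the dimension of X, which is why Monte Carlo remains practical for high-dimensional integrals.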
Importance sampling rewrites an expectation under a hard-to-sample distribution p as an expectation under an easier proposal distribution q, reweighting each sample by w(x) = p(x)/q(x); for this to be valid, q must be nonzero wherever p is.
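A sketch in pure Python, where the target N(0, 1), the wider proposal N(1, 2), and the test function E_p[X^2] = 1 are all illustrative assumptions:

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def importance_estimate(f, n=200_000):
    """Estimate E_p[f(X)] for p = N(0, 1) using proposal q = N(1, 2)."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(1.0, 2.0)                      # draw from q
        w = normal_pdf(x, 0.0, 1.0) / normal_pdf(x, 1.0, 2.0)  # weight p/q
        total += f(x) * w
    return total / n

random.seed(0)
est = importance_estimate(lambda x: x * x)  # true value under p is 1
```

A proposal that is too narrow relative to p produces occasional huge weights and a high-variance estimate, which is why the wider q is used here.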
Rejection sampling draws from a hard target distribution by sampling an easier proposal q and accepting each draw with probability p(x)/(M q(x)), where the constant M satisfies p(x) ≤ M q(x) for all x.
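A pure-Python sketch; the Beta(2, 2) target, the Uniform(0, 1) proposal, and the bound M = 1.5 (the density's peak at x = 0.5) are illustrative assumptions:

```python
import random

def target_pdf(x):
    """Beta(2, 2) density on [0, 1]; its maximum is 1.5 at x = 0.5."""
    return 6.0 * x * (1.0 - x)

M = 1.5  # satisfies target_pdf(x) <= M * 1.0 for the Uniform(0, 1) proposal

def rejection_sample():
    while True:
        x = random.random()                           # propose from Uniform(0, 1)
        if random.random() < target_pdf(x) / M:       # accept with prob p(x)/(M q(x))
            return x

random.seed(0)
samples = [rejection_sample() for _ in range(50_000)]
mean = sum(samples) / len(samples)  # Beta(2, 2) has mean 0.5
```

The expected number of proposals per accepted sample is M, so a tight bound keeps the method efficient.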
MCMC builds a random walk (a Markov chain) whose long-run visiting frequency matches your target distribution, even when the target is only known up to a constant.
Metropolis–Hastings is a clever accept/reject method that lets you sample from complex probability distributions using only an unnormalized density.
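A minimal random-walk Metropolis sketch in pure Python, targeting a standard normal known only through its unnormalized log-density; the step size and chain length are illustrative:

```python
import math
import random

def log_unnorm(x):
    """Unnormalized log-density of N(0, 1); the normalizing constant is never needed."""
    return -0.5 * x * x

def metropolis_hastings(n=50_000, step=1.0):
    x, chain = 0.0, []
    for _ in range(n):
        proposal = x + random.gauss(0.0, step)  # symmetric random-walk proposal
        # Accept with prob min(1, p(x')/p(x)); comparing logs avoids underflow.
        if math.log(random.random()) < log_unnorm(proposal) - log_unnorm(x):
            x = proposal
        chain.append(x)
    return chain

random.seed(0)
chain = metropolis_hastings()
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

Because only the ratio p(x')/p(x) appears, any constant factor in p cancels, which is exactly why an unnormalized density suffices.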
Gibbs sampling is an MCMC method that generates samples by repeatedly drawing each variable from its conditional distribution given the others.
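A pure-Python sketch for a bivariate standard normal with correlation rho, chosen because both conditionals are known in closed form (x | y ~ N(rho·y, 1 − rho²), and symmetrically); the value rho = 0.8 is illustrative:

```python
import math
import random

def gibbs_bivariate_normal(rho=0.8, n=50_000):
    """Alternately resample each coordinate from its exact conditional."""
    sd = math.sqrt(1.0 - rho * rho)
    x = y = 0.0
    xs, ys = [], []
    for _ in range(n):
        x = random.gauss(rho * y, sd)  # x | y ~ N(rho*y, 1 - rho^2)
        y = random.gauss(rho * x, sd)  # y | x ~ N(rho*x, 1 - rho^2)
        xs.append(x)
        ys.append(y)
    return xs, ys

random.seed(0)
xs, ys = gibbs_bivariate_normal()
corr = sum(a * b for a, b in zip(xs, ys)) / len(xs)  # E[XY] = rho here
```

No accept/reject step is needed: each conditional draw leaves the joint distribution invariant, though mixing slows as rho approaches 1.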
The reparameterization trick rewrites a random variable as a deterministic function of noise that does not depend on the parameters, such as z = μ + σ · ε with ε ~ N(0, 1).
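A pure-Python sketch of the Gaussian case; the parameter values mu = 2.0 and sigma = 0.5 are illustrative:

```python
import random

def reparameterized_sample(mu, sigma):
    """z = mu + sigma * eps: eps carries all the randomness, so z is a
    deterministic function of (mu, sigma) and gradients can flow through it
    (dz/dmu = 1, dz/dsigma = eps)."""
    eps = random.gauss(0.0, 1.0)
    return mu + sigma * eps

random.seed(0)
draws = [reparameterized_sample(2.0, 0.5) for _ in range(100_000)]
mean = sum(draws) / len(draws)  # should match mu
```

This is the trick that lets VAEs backpropagate through a sampling step: the gradient with respect to mu and sigma is well defined because the noise source is parameter-free.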
Stratified sampling reduces Monte Carlo variance by dividing the domain into non-overlapping regions (strata) and sampling within each region.
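A pure-Python sketch integrating an illustrative function over [0, 1] with equal-width strata; the stratum count and per-stratum sample size are arbitrary choices:

```python
import random

def stratified_estimate(f, n_strata=100, per_stratum=10):
    """Estimate the integral of f over [0, 1] by drawing the same number of
    uniform points inside each of n_strata equal-width bins."""
    total, width = 0.0, 1.0 / n_strata
    for k in range(n_strata):
        for _ in range(per_stratum):
            x = (k + random.random()) * width  # uniform within stratum k
            total += f(x)
    return total / (n_strata * per_stratum)

random.seed(0)
est = stratified_estimate(lambda x: x * x)  # true integral is 1/3
```

Forcing an equal number of samples into every bin eliminates the between-strata component of the variance, so the estimate is never worse than plain Monte Carlo with the same budget.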
Hamiltonian Monte Carlo (HMC) uses gradients of the log-density to propose long-distance moves that still land in high-probability regions.
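A compact pure-Python sketch of one HMC transition targeting a standard normal, using the leapfrog integrator and a Metropolis correction; the step size and trajectory length are illustrative tuning choices:

```python
import math
import random

def hmc_step(x, grad_log_p, step=0.2, n_leapfrog=20):
    """One HMC transition: sample momentum, simulate Hamiltonian dynamics
    with leapfrog, then accept/reject based on the energy error."""
    p = random.gauss(0.0, 1.0)                 # resample momentum
    x_new, p_new = x, p
    p_new += 0.5 * step * grad_log_p(x_new)    # half step for momentum
    for i in range(n_leapfrog):
        x_new += step * p_new                  # full step for position
        if i < n_leapfrog - 1:
            p_new += step * grad_log_p(x_new)  # full step for momentum
    p_new += 0.5 * step * grad_log_p(x_new)    # final half step
    # H = -log p(x) + p^2/2; for N(0,1), -log p(x) = x^2/2 up to a constant.
    h_old = 0.5 * x * x + 0.5 * p * p
    h_new = 0.5 * x_new * x_new + 0.5 * p_new * p_new
    return x_new if math.log(random.random()) < h_old - h_new else x

random.seed(0)
x, chain = 0.0, []
for _ in range(20_000):
    x = hmc_step(x, lambda v: -v)  # gradient of log N(0, 1) density
    chain.append(x)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

Because leapfrog nearly conserves the Hamiltonian, even long trajectories are accepted with high probability, giving distant, low-correlation moves that random-walk methods cannot match.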
Langevin dynamics is noisy gradient ascent on the log-density: particles step toward high-probability regions while injected Gaussian noise keeps the chain exploring the full distribution instead of collapsing onto a mode.
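A pure-Python sketch of the unadjusted Langevin algorithm (no Metropolis correction) targeting a standard normal; the step size, chain length, and starting point are illustrative:

```python
import math
import random

def langevin_chain(grad_log_p, step=0.01, n=100_000, x0=3.0):
    """Iterate x <- x + step * grad_log_p(x) + sqrt(2 * step) * noise.
    The gradient term pulls toward high-density regions; the noise term
    keeps the chain diffusing so it samples rather than just optimizes."""
    x, chain = x0, []
    noise_sd = math.sqrt(2.0 * step)
    for _ in range(n):
        x += step * grad_log_p(x) + noise_sd * random.gauss(0.0, 1.0)
        chain.append(x)
    return chain

random.seed(0)
chain = langevin_chain(lambda x: -x)  # gradient of log N(0, 1) density
mean = sum(chain) / len(chain)
```

With the noise term removed this is plain gradient ascent and the chain collapses to the mode; the sqrt(2 · step) scaling is what makes the stationary distribution approximate the target. This same update rule underlies score-based generative models, where grad_log_p is replaced by a learned score network.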