Concepts
Algorithmic Information Theory
Algorithmic Information Theory studies information content via the shortest programs that generate data, rather than via average-case probabilities.
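Kolmogorov complexity itself is uncomputable, but compressed length gives a crude, computable proxy for it. A minimal sketch (my own illustration, not part of the theory's formal machinery) comparing a highly structured string with random bytes:

```python
# Compressed size as a rough upper bound on information content:
# structured data admits a short description, random data does not.
import os
import zlib

repetitive = b"ab" * 500        # describable by a tiny program: "print 'ab' * 500"
random_ish = os.urandom(1000)   # incompressible with overwhelming probability

len_rep = len(zlib.compress(repetitive))
len_rnd = len(zlib.compress(random_ish))
assert len_rep < len_rnd
```

The gap between the two compressed sizes is the point: both inputs are 1000 bytes, but their shortest descriptions differ enormously.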
Online Algorithm Theory
Online algorithms make decisions step by step without seeing the future and are judged against an all-knowing offline optimum.
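The classic teaching example is ski rental: rent day by day or buy outright, without knowing how long the season lasts. A minimal sketch of the break-even strategy (parameter names are my own), which is 2-competitive against the offline optimum:

```python
def ski_rental_cost(season_length, buy_price, rent_price=1):
    """Break-even strategy: rent until the accumulated rent would reach
    buy_price, then buy. Total cost never exceeds 2x the offline optimum."""
    cost = 0
    for _ in range(season_length):
        if cost + rent_price >= buy_price:
            return cost + buy_price   # buying now is no worse than renting on
        cost += rent_price
    return cost

# The offline optimum knows season_length in advance.
for n in [1, 5, 10, 50]:
    opt = min(n * 1, 10)
    assert ski_rental_cost(n, buy_price=10) <= 2 * opt
```

The loop at the bottom checks the competitive ratio empirically: for every season length, the online cost stays within a factor of 2 of the all-knowing optimum.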
Approximation Algorithm Theory
Approximation algorithms deliver provably near-optimal solutions for NP-hard optimization problems within guaranteed factors.
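A standard instance is Vertex Cover: finding a minimum cover is NP-hard, but taking both endpoints of each uncovered edge (a maximal matching) gives a cover at most twice the optimum. A minimal sketch:

```python
def vertex_cover_2approx(edges):
    """Take both endpoints of every edge not yet covered. Each picked pair
    comes from a matching, and any cover must hit each matched edge once,
    so |cover| <= 2 * OPT."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
cover = vertex_cover_2approx(edges)
assert all(u in cover or v in cover for u, v in edges)
```

On this graph the optimum is {1, 3} (size 2) while the algorithm may return 4 vertices, exactly the guaranteed factor of 2.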
Randomized Algorithm Theory
Randomized algorithms use random bits to make choices that simplify design, avoid worst cases, and often speed up computation.
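Quickselect with a random pivot is a compact illustration: no fixed input can force the worst case, and the expected running time is linear. A minimal sketch:

```python
import random

def quickselect(a, k):
    """Return the k-th smallest element (0-indexed). A random pivot makes
    the expected time O(n) on every input, since no adversary can predict
    the partition."""
    pivot = random.choice(a)
    lo = [x for x in a if x < pivot]
    eq = [x for x in a if x == pivot]
    if k < len(lo):
        return quickselect(lo, k)
    if k < len(lo) + len(eq):
        return pivot
    return quickselect([x for x in a if x > pivot], k - len(lo) - len(eq))

data = [7, 2, 9, 4, 4, 1, 8]
assert quickselect(data, 3) == sorted(data)[3]
```

A deterministic pivot rule (say, always the first element) degrades to quadratic time on sorted inputs; randomization removes that dependence on input order.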
Amortized Analysis
Amortized analysis measures the average cost per operation over a worst-case sequence, not over random inputs.
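The canonical example is the doubling dynamic array: an individual push can cost O(n) when the buffer is resized, yet any sequence of n pushes moves at most 2n elements in total. A minimal sketch (the `copies` counter is my own instrumentation):

```python
class DynArray:
    """Doubling array. A resize copies all current elements, but resizes
    happen at sizes 1, 2, 4, ..., so n pushes copy < 2n elements overall:
    amortized O(1) per push."""
    def __init__(self):
        self.cap, self.n, self.buf = 1, 0, [None]
        self.copies = 0   # total elements moved across all resizes

    def push(self, x):
        if self.n == self.cap:
            self.buf = self.buf + [None] * self.cap   # double the capacity
            self.copies += self.n
            self.cap *= 2
        self.buf[self.n] = x
        self.n += 1

a = DynArray()
for i in range(1000):
    a.push(i)
assert a.copies <= 2 * 1000   # worst-case sequence, yet O(1) average per push
```

Note the average is over the operation sequence, not over random inputs: the bound holds for every sequence of 1000 pushes.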
Computability Theory
Computability theory studies the boundary between what can and cannot be computed by any algorithm.
Optimal Transport Theory
Optimal Transport (OT) formalizes the cheapest way to move one probability distribution into another given a cost to move mass.
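In one dimension with the absolute-difference cost, the optimal plan is monotone: match sorted samples in order. A minimal sketch for equal-size empirical distributions with uniform weights (an assumption I'm making to keep the example closed-form):

```python
def wasserstein1_1d(xs, ys):
    """Wasserstein-1 distance between two equal-size empirical distributions
    on the line: the optimal coupling pairs the i-th smallest of xs with the
    i-th smallest of ys."""
    assert len(xs) == len(ys), "uniform-weight, equal-size case only"
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

assert wasserstein1_1d([0, 0, 0], [1, 1, 1]) == 1.0   # shift all mass by 1
assert wasserstein1_1d([0, 1], [1, 0]) == 0.0          # identical distributions
```

General costs and unequal weights require a linear program (or Sinkhorn iterations), but the 1D case shows the core idea: a cheapest rearrangement of mass, not a pointwise comparison of densities.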
Halting Problem
The Halting Problem asks whether a given program P will eventually stop when run on input x; there is no algorithm that correctly answers this for all P and x.
Diffusion Models Theory
Diffusion models learn to reverse a simple noising process by estimating the score (the gradient of the log density) of data at different noise levels.
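For a Gaussian the score has a closed form, which makes the definition concrete: for N(mu, sigma^2) it is -(x - mu) / sigma^2. A minimal sketch checking the analytic score against a numerical derivative of the log density (a toy stand-in for the neural estimator a diffusion model trains):

```python
import math

def gaussian_score(x, mu=0.0, sigma=1.0):
    """Score of N(mu, sigma^2): d/dx log p(x) = -(x - mu) / sigma^2."""
    return -(x - mu) / sigma**2

def log_density(x, mu=0.0, sigma=1.0):
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

# Central finite difference of log p should match the analytic score.
x, h = 0.7, 1e-5
numerical = (log_density(x + h) - log_density(x - h)) / (2 * h)
assert abs(numerical - gaussian_score(x)) < 1e-6
```

In an actual diffusion model the data density is unknown, so a network is trained (via denoising score matching) to approximate this gradient at each noise level; sampling then follows the estimated score from noise back toward data.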
NP-Completeness
NP-completeness classifies decision problems that are both in NP and as hard as any problem in NP via polynomial-time reductions.
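Membership in NP means a proposed solution (a certificate) can be checked in polynomial time, even when finding one seems to require exhaustive search. A minimal sketch using Vertex Cover (the function name is my own):

```python
def verify_vertex_cover(edges, cover):
    """Polynomial-time verifier: check a certificate (a vertex set) against
    every edge. Verification is easy; finding a smallest cover is NP-hard."""
    return all(u in cover or v in cover for u, v in edges)

edges = [(0, 1), (1, 2), (2, 3)]
assert verify_vertex_cover(edges, {1, 2})        # valid certificate
assert not verify_vertex_cover(edges, {0, 3})    # edge (1, 2) is uncovered
```

NP-completeness adds the hardness direction: every problem in NP reduces to Vertex Cover in polynomial time, so a polynomial algorithm for it would collapse P and NP.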
Variational Inference Theory
Variational Inference (VI) replaces an intractable posterior with a simpler distribution and optimizes it by minimizing KL divergence, which is equivalent to maximizing the ELBO.
ELBO (Evidence Lower Bound)
The Evidence Lower Bound (ELBO) is a tractable lower bound on the log evidence log p(x) that enables learning and inference in latent variable models like VAEs.
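Both properties of the ELBO, that it lower-bounds log p(x) and that the gap is exactly KL(q || posterior), can be checked in closed form on a tiny conjugate model. A minimal sketch, assuming the illustrative model z ~ N(0,1), x|z ~ N(z,1), for which the evidence is N(x; 0, 2) and the exact posterior is N(x/2, 1/2):

```python
import math

def elbo(x, m, s2):
    """Closed-form ELBO for Gaussian q(z) = N(m, s2) under the model
    z ~ N(0,1), x|z ~ N(z,1): E_q[log p(x|z)] + E_q[log p(z)] + H[q]."""
    e_lik = -0.5 * math.log(2 * math.pi) - ((x - m) ** 2 + s2) / 2
    e_prior = -0.5 * math.log(2 * math.pi) - (m ** 2 + s2) / 2
    entropy = 0.5 * math.log(2 * math.pi * math.e * s2)
    return e_lik + e_prior + entropy

x = 1.3
log_evidence = -0.5 * math.log(2 * math.pi * 2) - x ** 2 / 4   # log N(x; 0, 2)

assert elbo(x, 0.0, 1.0) <= log_evidence + 1e-12           # always a lower bound
assert abs(elbo(x, x / 2, 0.5) - log_evidence) < 1e-12     # tight at the exact posterior
```

Maximizing the ELBO over (m, s2) therefore drives q toward the true posterior, which is the mechanism behind variational inference and VAE training; in a VAE the closed forms above are replaced by Monte Carlo estimates through an encoder network.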