Concepts
PAC-Bayes Theory
PAC-Bayes provides high-probability generalization bounds for randomized predictors by comparing a data-dependent posterior Q to a fixed, data-independent prior P through KL(Q||P).
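A minimal sketch of one common statement of the bound (McAllester's form); the empirical risk, KL value, and sample size below are illustrative placeholders:

```python
import math

def mcallester_bound(emp_risk, kl, n, delta=0.05):
    """McAllester-style PAC-Bayes bound: with probability >= 1 - delta,
    the expected risk of a predictor drawn from posterior Q is at most
    emp_risk + sqrt((KL(Q||P) + ln(2*sqrt(n)/delta)) / (2n))."""
    slack = math.sqrt((kl + math.log(2 * math.sqrt(n) / delta)) / (2 * n))
    return emp_risk + slack

# e.g. 10,000 samples, KL(Q||P) = 5 nats, 10% training error
print(mcallester_bound(emp_risk=0.10, kl=5.0, n=10_000))  # ~0.126
```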
Concentration Inequalities
Concentration inequalities give high-probability bounds that random outcomes stay close to their expectations, even without knowing the full distribution.
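As an illustration, a small simulation comparing Hoeffding's inequality against the empirical deviation frequency of a coin-flip mean (the parameters are arbitrary choices):

```python
import math, random

def hoeffding_bound(n, t):
    # Two-sided Hoeffding bound for the mean of n i.i.d. variables in [0, 1]
    return 2 * math.exp(-2 * n * t * t)

n, t, trials = 200, 0.1, 20_000
deviations = 0
for _ in range(trials):
    mean = sum(random.random() < 0.5 for _ in range(n)) / n  # fair coin flips
    deviations += abs(mean - 0.5) >= t
print("empirical:", deviations / trials, " Hoeffding bound:", hoeffding_bound(n, t))
```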
MCMC Theory
MCMC simulates a Markov chain whose long-run behavior matches a target distribution, letting us sample from complex posteriors without knowing the normalization constant.
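A minimal random-walk Metropolis sketch, one of the simplest MCMC algorithms; the target density and step size here are illustrative choices:

```python
import math, random

def metropolis_hastings(log_unnorm, x0, steps, step_size=0.5):
    """Random-walk Metropolis: samples from a density known only up to
    normalization, via its log, log_unnorm(x)."""
    x, samples = x0, []
    for _ in range(steps):
        proposal = x + random.gauss(0, step_size)
        log_alpha = log_unnorm(proposal) - log_unnorm(x)
        # Accept with prob min(1, p(proposal)/p(x)); the unknown
        # normalization constant cancels in the ratio
        if log_alpha >= 0 or random.random() < math.exp(log_alpha):
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal, written without its normalizing constant
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, steps=50_000)
print(sum(draws) / len(draws))  # sample mean should be near 0
```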
Markov Chain Theory
A Markov chain is a random process where the next state depends only on the current state, not the full history.
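A toy two-state chain, simulated to show long-run frequencies settling to a stationary distribution (the transition probabilities are made up for illustration):

```python
import random

# Two-state weather chain: the next state depends only on the current one
P = {"sunny": (("sunny", "rainy"), (0.9, 0.1)),
     "rainy": (("sunny", "rainy"), (0.5, 0.5))}

def step(state):
    states, probs = P[state]
    return random.choices(states, probs)[0]

state = "sunny"
counts = {"sunny": 0, "rainy": 0}
for _ in range(100_000):
    state = step(state)
    counts[state] += 1
# Long-run frequencies approach the stationary distribution (5/6, 1/6)
print({s: c / 100_000 for s, c in counts.items()})
```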
Graph Neural Network Theory
Graph Neural Networks (GNNs) learn on graphs by repeatedly letting each node aggregate messages from its neighbors and update its representation.
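A sketch of one mean-aggregation message-passing layer in NumPy; the self-loops, shared linear map, and ReLU mirror common GNN layers, but the exact layer form here is an illustrative choice:

```python
import numpy as np

def message_passing_layer(A, H, W):
    """One GNN layer: each node averages its neighbors' features
    (including itself via self-loops), then applies a shared linear
    map followed by a ReLU nonlinearity."""
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)   # node degrees
    H_agg = (A_hat @ H) / deg                # mean over each neighborhood
    return np.maximum(0, H_agg @ W)          # update: linear + ReLU

# Toy graph: 4 nodes on a path, 3-dim input features, 2-dim output
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
H = np.random.randn(4, 3)
W = np.random.randn(3, 2)
print(message_passing_layer(A, H, W))
```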
Differential Privacy Theory
Differential privacy (DP) guarantees that the output distribution of a randomized algorithm changes very little when any one person's data is added or removed.
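A sketch of the classic Laplace mechanism, one standard way to achieve epsilon-DP for numeric queries; the query value and epsilon below are illustrative:

```python
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """epsilon-DP release of a numeric query via the Laplace mechanism.
    sensitivity = max change in the query's value when one person's data
    is added or removed (1 for a counting query)."""
    scale = sensitivity / epsilon
    # Difference of two i.i.d. exponentials is Laplace(0, scale)
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_value + noise

# e.g. privately release a count of 1000 people at epsilon = 0.1
print(laplace_mechanism(1000, sensitivity=1, epsilon=0.1))
```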
Spectral Graph Theory
Spectral graph theory studies graphs through the eigenvalues and eigenvectors of associated matrices such as the adjacency matrix A, the combinatorial Laplacian L = D - A, and the normalized Laplacian L_norm.
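A small NumPy example of the spectrum carrying structural information: the multiplicity of the Laplacian's zero eigenvalue counts connected components (the toy graph is an arbitrary choice):

```python
import numpy as np

# Toy graph with two components: a triangle plus a disjoint edge
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 0, 0],
              [0, 0, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))
L = D - A  # combinatorial Laplacian

print(np.round(np.linalg.eigvalsh(L), 6))  # [0, 0, 2, 3, 3]
# Eigenvalue 0 appears twice: the graph has two connected components
```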
Information-Theoretic Lower Bounds
Information-theoretic lower bounds tell you the best possible performance any learning algorithm can achieve, regardless of cleverness or compute.
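One standard tool behind such bounds is Fano's inequality; a minimal sketch, with illustrative numbers for the hypothesis count and mutual information:

```python
import math

def fano_lower_bound(num_hypotheses, mutual_info):
    """Fano's inequality: any estimator of a uniform V in {1..M} from
    data X errs with probability at least
    1 - (I(V;X) + ln 2) / ln M   (information measured in nats)."""
    return max(0.0, 1 - (mutual_info + math.log(2)) / math.log(num_hypotheses))

# With M = 1024 hypotheses but only 3 nats of information in the data,
# no algorithm can identify the truth with error probability below ~47%
print(fano_lower_bound(1024, 3.0))
```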
Quantum Computing Theory
Quantum computing uses qubits that can be in superpositions, enabling interference-based computation beyond classical bits.
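A tiny NumPy simulation of interference: applying the Hadamard gate once creates an equal superposition, and applying it again returns |0> exactly because the |1> amplitudes cancel:

```python
import numpy as np

# Single-qubit states as 2-vectors of complex amplitudes
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

superposed = H @ ket0           # equal superposition of |0> and |1>
print(np.abs(superposed) ** 2)  # measurement probabilities: [0.5, 0.5]

back = H @ superposed           # amplitudes interfere
print(np.abs(back) ** 2)        # [1, 0]: the |1> amplitude cancels out
```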
Streaming Algorithm Theory
Streaming algorithms process massive data one pass at a time using sublinear (often polylogarithmic) memory.
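Reservoir sampling is a classic example: a uniform sample of k items from a stream of unknown length, in one pass and O(k) memory. A minimal sketch:

```python
import random

def reservoir_sample(stream, k):
    """Keep a uniform random sample of k items from a stream of unknown
    length, using a single pass and O(k) memory."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            # Keep each incoming item with probability k / (i + 1)
            j = random.randrange(i + 1)
            if j < k:
                reservoir[j] = item
    return reservoir

print(reservoir_sample(range(1_000_000), k=5))
```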
Distributed Algorithm Theory
Distributed algorithm theory studies how multiple independent computers can cooperate correctly and efficiently despite message delays and failures.
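As one concrete illustration (not the whole field), Lamport logical clocks order events consistently despite message delays; a minimal sketch:

```python
# Lamport logical clocks: assign timestamps so that causally related
# events are ordered correctly, without any shared physical clock
class Process:
    def __init__(self, name):
        self.name, self.clock = name, 0

    def local_event(self):
        self.clock += 1
        return self.clock

    def send(self):
        self.clock += 1
        return self.clock  # the timestamp travels with the message

    def receive(self, msg_ts):
        # Jump past the sender's timestamp so causality is respected
        self.clock = max(self.clock, msg_ts) + 1
        return self.clock

a, b = Process("A"), Process("B")
ts = a.send()
print("B's clock after receive:", b.receive(ts))  # 2: B jumps past A's send
```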
Parallel Algorithm Theory
Parallel algorithm theory studies how to solve problems faster by coordinating many processors that share work and memory.
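A minimal sketch in the work-span model: a tree reduction does the same total work as a sequential sum, but its critical path is only logarithmic, which caps the idealized speedup at work/span:

```python
import math

def work_and_span(n):
    """Parallel tree reduction of n values: total work is n - 1 additions,
    but the critical path (span) is only ceil(log2 n), so with enough
    processors the running time drops from O(n) to O(log n)."""
    work = n - 1
    span = math.ceil(math.log2(n))
    return work, span

for n in (8, 1024, 1_000_000):
    w, s = work_and_span(n)
    print(f"n={n}: work={w}, span={s}, max speedup ~ {w / s:.0f}")
```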