Concepts
Mutual Information
Mutual Information (MI) measures how much knowing one random variable reduces uncertainty about another.
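A minimal sketch from the definition I(X; Y) = Σ p(x,y) log2( p(x,y) / (p(x) p(y)) ), assuming a toy joint distribution given as a row-major table (the function name and data are illustrative):

```python
import math

def mutual_information(joint):
    # joint[i][j] = p(x_i, y_j); marginals are row and column sums
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Independent variables give zero MI:
# mutual_information([[0.25, 0.25], [0.25, 0.25]]) == 0.0
```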
KL Divergence (Kullback-Leibler Divergence)
Kullback–Leibler (KL) divergence measures how one probability distribution P allocates probability mass differently from a reference distribution Q.
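A minimal sketch of D(P‖Q) = Σ p(x) log2( p(x) / q(x) ) over discrete distributions given as lists (assumed toy inputs):

```python
import math

def kl_divergence(p, q):
    # Defined when q(x) > 0 wherever p(x) > 0; terms with p(x) = 0 vanish.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Note the asymmetry: kl_divergence(p, q) generally differs
# from kl_divergence(q, p).
```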
Shannon Entropy
Shannon entropy quantifies the average uncertainty or information content of a random variable in bits when using base-2 logarithms.
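A minimal sketch of H(X) = -Σ p(x) log2 p(x) for a discrete distribution given as a list of probabilities:

```python
import math

def shannon_entropy(p):
    # Terms with p(x) = 0 contribute nothing (0 * log 0 := 0).
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly one bit of uncertainty:
# shannon_entropy([0.5, 0.5]) == 1.0
```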
Information Theory
Information theory quantifies uncertainty and information using measures like entropy, cross-entropy, KL divergence, and mutual information.
DP on Broken Profile - Plug DP
Plug DP (DP on broken profile with plugs) sweeps a grid cell by cell while remembering how partial path segments cross the frontier as labeled “plugs.”
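A minimal sketch of the broken-profile idea in its simplest form, where each frontier cell needs only a covered/uncovered bit rather than a labeled plug: counting domino tilings of an n-row, m-column grid (the problem choice is illustrative, not from the text):

```python
def count_domino_tilings(n: int, m: int) -> int:
    if m > n:
        n, m = m, n  # keep the bitmask over the shorter side
    dp = {0: 1}  # bit c = 1 iff the frontier cell in column c is already covered
    for i in range(n):
        for j in range(m):
            ndp = {}
            bit = 1 << j
            for mask, ways in dp.items():
                if mask & bit:
                    # (i, j) was covered by a domino placed earlier; after
                    # advancing, bit j refers to the empty cell (i+1, j)
                    ndp[mask ^ bit] = ndp.get(mask ^ bit, 0) + ways
                else:
                    if i + 1 < n:
                        # vertical domino on (i, j) and (i+1, j)
                        ndp[mask | bit] = ndp.get(mask | bit, 0) + ways
                    if j + 1 < m and not mask & (bit << 1):
                        # horizontal domino on (i, j) and (i, j+1)
                        nm = mask | (bit << 1)
                        ndp[nm] = ndp.get(nm, 0) + ways
            dp = ndp
    return dp.get(0, 0)

# count_domino_tilings(2, 2) == 2, count_domino_tilings(2, 3) == 3
```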
Matrix Exponentiation - Advanced
Matrix exponentiation turns k applications of an n×n linear transition into a fast O(n^3 log k) computation using exponentiation by squaring.
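A hedged sketch applying exponentiation by squaring to the 2×2 Fibonacci transition matrix [[1, 1], [1, 0]] (the modulus and example recurrence are assumptions for illustration):

```python
MOD = 10**9 + 7  # assumed modulus, purely illustrative

def mat_mul(A, B):
    n, k, m = len(A), len(B), len(B[0])
    C = [[0] * m for _ in range(n)]
    for i in range(n):
        for t in range(k):
            if A[i][t]:
                a = A[i][t]
                for j in range(m):
                    C[i][j] = (C[i][j] + a * B[t][j]) % MOD
    return C

def mat_pow(A, k):
    n = len(A)
    R = [[int(i == j) for j in range(n)] for i in range(n)]  # identity
    while k:
        if k & 1:
            R = mat_mul(R, A)
        A = mat_mul(A, A)
        k >>= 1
    return R

def fib(k):
    # k-th Fibonacci number mod MOD: [[1,1],[1,0]]^k holds F(k) at [0][1]
    return mat_pow([[1, 1], [1, 0]], k)[0][1]
```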
Top Tree
Top trees are dynamic tree data structures that represent a forest as a hierarchy of clusters, allowing O(log n) amortized time for link, cut, path queries/updates, and many subtree queries.
Voronoi Diagram and Delaunay
Voronoi diagrams partition the plane so each region contains points closest to one site, while the Delaunay triangulation connects sites whose Voronoi cells touch.
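A small sketch, assuming SciPy is available, that builds both structures for the same point set via scipy.spatial (which wraps Qhull); the point set is made up for illustration:

```python
import numpy as np
from scipy.spatial import Voronoi, Delaunay

pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0.5]])
vor = Voronoi(pts)
tri = Delaunay(pts)
print(vor.vertices)   # Voronoi vertices = circumcenters of Delaunay triangles
print(tri.simplices)  # Delaunay triangles as index triples into pts
```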
Sprague-Grundy Theorem
Sprague–Grundy theory turns every finite impartial game under normal play into an equivalent Nim heap whose size is called the Grundy number.
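A minimal sketch for an assumed subtraction game (moves remove 1, 3, or 4 stones, chosen for illustration), computing Grundy numbers via the mex of reachable values:

```python
from functools import lru_cache

MOVES = (1, 3, 4)  # assumed subtraction set, purely illustrative

@lru_cache(maxsize=None)
def grundy(n: int) -> int:
    reachable = {grundy(n - m) for m in MOVES if m <= n}
    g = 0
    while g in reachable:  # mex: the minimum excludant
        g += 1
    return g

# The player to move loses exactly when grundy(n) == 0.
```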
Game Theory - Nim
Nim is a two-player impartial game with several piles where a move removes any positive number of stones from exactly one pile.
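A one-function sketch of Bouton's winning criterion (the function name is illustrative): the first player wins iff the XOR (nim-sum) of the pile sizes is nonzero.

```python
from functools import reduce
from operator import xor

def first_player_wins(piles):
    # Bouton's theorem: nonzero nim-sum means a winning move exists.
    return reduce(xor, piles, 0) != 0

# first_player_wins([1, 2, 3]) is False, since 1 ^ 2 ^ 3 == 0.
```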
Game Theory - Calculation Techniques
Grundy numbers are calculated as the mex (minimum excludant) of the values reachable in one move, and the Grundy number of a sum of independent games is the XOR (nim-sum) of the components' values.
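A sketch of the disjunctive-sum rule with hypothetical component values:

```python
from functools import reduce
from operator import xor

def combined_grundy(component_values):
    # XOR the Grundy numbers of independent subgames; a nonzero
    # result means the player to move can win the combined game.
    return reduce(xor, component_values, 0)

# e.g. components valued 2, 3, 1 combine to 2 ^ 3 ^ 1 == 0:
# a loss for the player to move.
```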
Variance and Covariance
Variance measures how spread out a random variable is around its mean, while covariance measures how two variables move together.
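A minimal sketch of the population definitions (dividing by n rather than n-1; helper names are illustrative):

```python
def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    mu = mean(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

def covariance(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

# A variable's covariance with itself is its variance:
# covariance(xs, xs) == variance(xs)
```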