Concepts
Randomized Algorithm Theory
Randomized algorithms use random bits to make choices that simplify design, avoid worst cases, and often speed up computation.
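A minimal sketch of the idea, assuming Python (the function name randomized_quicksort is illustrative, not from the source): picking the pivot at random means no fixed input can reliably trigger the quadratic worst case.

```python
import random

def randomized_quicksort(items):
    """Sort a list by partitioning around a randomly chosen pivot.

    The random choice is what avoids adversarial worst-case inputs:
    no particular ordering of the input forces O(n^2) behavior."""
    if len(items) <= 1:
        return items
    pivot = random.choice(items)          # the random bit(s) at work
    less    = [x for x in items if x < pivot]
    equal   = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

print(randomized_quicksort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```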
Probability Theory
Probability theory formalizes uncertainty using a sample space, events, and a probability measure that obeys clear axioms.
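Stated formally (a standard rendering of the axioms, with Omega the sample space, F the collection of events, and P the probability measure):

```latex
% Kolmogorov's axioms for a probability measure P on (\Omega, \mathcal{F})
\begin{align}
  & P(A) \ge 0 \quad \text{for every event } A \in \mathcal{F} \\
  & P(\Omega) = 1 \\
  & P\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} P(A_i)
    \quad \text{for pairwise disjoint events } A_1, A_2, \ldots
\end{align}
```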
KL Divergence (Kullback-Leibler Divergence)
Kullback-Leibler (KL) divergence measures how one probability distribution P allocates probability mass differently from a reference distribution Q.
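A small sketch of the discrete-case formula D_KL(P || Q) = sum_x P(x) log(P(x)/Q(x)), assuming Python and an illustrative function name kl_divergence; note that the measure is asymmetric, so swapping P and Q generally changes the value.

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for two discrete distributions given as probability lists
    over the same outcomes. Terms with P(x) = 0 contribute nothing; any
    outcome with Q(x) = 0 but P(x) > 0 makes the divergence infinite."""
    total = 0.0
    for px, qx in zip(p, q):
        if px == 0:
            continue
        if qx == 0:
            return math.inf
        total += px * math.log(px / qx)
    return total

# A biased coin P measured against a fair reference Q:
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))  # about 0.368 nats
```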
Variance and Covariance
Variance measures how spread out a random variable is around its mean, while covariance measures how two variables move together.
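In symbols (a standard formulation, writing mu_X = E[X] and mu_Y = E[Y]):

```latex
\begin{align}
  \operatorname{Var}(X)    &= E\big[(X - \mu_X)^2\big] = E[X^2] - \mu_X^2 \\
  \operatorname{Cov}(X, Y) &= E\big[(X - \mu_X)(Y - \mu_Y)\big] = E[XY] - \mu_X \mu_Y
\end{align}
```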
Expected Value
Expected value is the long-run average outcome of a random variable if you could repeat the experiment many times.
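A quick worked sketch in Python, assuming a fair six-sided die: the exact expected value is 3.5, and the average of a large simulated sample should land close to it.

```python
import random

# Exact expected value of one roll of a fair six-sided die:
outcomes = [1, 2, 3, 4, 5, 6]
exact = sum(x * (1 / 6) for x in outcomes)   # 3.5

# The long-run average of many simulated rolls should approach it:
rolls = [random.choice(outcomes) for _ in range(100_000)]
print(exact, sum(rolls) / len(rolls))        # 3.5 and roughly 3.5
```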
Probability Fundamentals
Probability quantifies uncertainty by assigning numbers between 0 and 1 to events in a sample space.
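A minimal worked example, assuming a fair six-sided die:

```latex
% Sample space, an event, and its probability for one roll of a fair die:
\Omega = \{1, 2, 3, 4, 5, 6\}, \qquad
A = \{2, 4, 6\} \ (\text{``roll an even number''}), \qquad
P(A) = \frac{|A|}{|\Omega|} = \frac{3}{6} = 0.5
```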
Randomized Algorithms
Randomized algorithms use coin flips (random bits) to guide choices, often making code simpler and faster on average.
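One hedged sketch of a Monte Carlo-style randomized algorithm, assuming Python (the function name estimate_pi is illustrative): the "coin flips" are uniform random points in the unit square, and the answer is approximate but cheap to compute.

```python
import random

def estimate_pi(samples=1_000_000):
    """Monte Carlo estimate of pi: throw random points into the unit square
    and count the fraction that land inside the quarter circle of radius 1."""
    inside = 0
    for _ in range(samples):
        x, y = random.random(), random.random()   # the random choices
        if x * x + y * y <= 1.0:
            inside += 1
    return 4 * inside / samples

print(estimate_pi())   # roughly 3.14
```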