Concepts (7)

πŸ“š Theory · Intermediate

Randomized Algorithm Theory

Randomized algorithms use random bits to make choices that simplify design, avoid worst cases, and often speed up computation.

#randomized algorithms · #las vegas · #monte carlo (+12 more)
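The distinction behind the `las vegas` and `monte carlo` tags can be sketched in Python. This is a minimal illustration with hypothetical helper names, not a library API: a Las Vegas routine is always correct but has random running time, while a Monte Carlo routine runs for a fixed budget and returns an answer that is only probably accurate.

```python
import random

def las_vegas_find(items, target):
    """Las Vegas flavor: probe random positions until the answer is
    certainly correct. Runtime is random; the result never is.
    (Assumes target actually occurs in items.)"""
    while True:
        i = random.randrange(len(items))
        if items[i] == target:
            return i

def monte_carlo_pi(trials=100_000):
    """Monte Carlo flavor: a fixed number of random samples gives an
    estimate of pi that is close with high probability, never exact."""
    inside = sum(random.random() ** 2 + random.random() ** 2 <= 1.0
                 for _ in range(trials))
    return 4 * inside / trials
```

Either flavor can often be converted into the other, e.g. by capping the Las Vegas loop at a fixed number of probes.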
πŸ“š Theory · Intermediate

Probability Theory

Probability theory formalizes uncertainty using a sample space, events, and a probability measure that obeys clear axioms.

#probability measure · #random variable · #expectation (+12 more)
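The three ingredients named above can be made concrete for a finite sample space. A small sketch, using a fair die as an assumed example: the sample space is a set of outcomes, an event is a subset, and the probability measure sums point masses and obeys the axioms (nonnegativity, total mass 1, additivity over disjoint events).

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die as a finite probability space.
omega = {1, 2, 3, 4, 5, 6}           # sample space
P = {w: Fraction(1, 6) for w in omega}  # point masses

def prob(event):
    """Probability measure: sum the point masses of outcomes in the event."""
    return sum(P[w] for w in event)
```

With exact `Fraction` arithmetic, the axioms hold exactly: `prob(omega) == 1`, and for disjoint events A and B, `prob(A | B) == prob(A) + prob(B)`.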
πŸ“š Theory · Intermediate

KL Divergence (Kullback-Leibler Divergence)

Kullback–Leibler (KL) divergence measures how much a probability distribution P diverges from a reference distribution Q: the expected extra information cost of encoding samples from P using a code optimized for Q.

#kl divergence · #kullback-leibler · #cross-entropy (+12 more)
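For discrete distributions given as probability vectors, the definition is a one-line sum. A minimal sketch (hypothetical function name), assuming Q assigns positive mass wherever P does:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_x p(x) * log(p(x) / q(x)), in nats.
    Terms with p(x) == 0 contribute 0 by convention; assumes q(x) > 0
    wherever p(x) > 0 (absolute continuity)."""
    return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)
```

Two standard properties follow: D_KL(P || P) = 0, and the divergence is asymmetric, so it is not a metric.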
βˆ‘ Math · Intermediate

Variance and Covariance

Variance measures how spread out a random variable is around its mean, while covariance measures how two variables move together.

#variance · #covariance · #standard deviation (+12 more)
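Both quantities reduce to averages of deviations from the mean. A minimal sketch over equal-weight samples (population formulas, hypothetical helper names):

```python
def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    """Population variance: E[(X - E[X])^2], the mean squared deviation."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def covariance(xs, ys):
    """Population covariance: E[(X - E[X]) * (Y - E[Y])].
    Positive when xs and ys move together, negative when they move oppositely."""
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
```

Note that variance is the special case Cov(X, X), and the standard deviation from the tags is just its square root.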
βˆ‘ Math · Intermediate

Expected Value

Expected value is the long-run average outcome of a random variable if you could repeat the experiment many times.

#expected value · #linearity of expectation · #indicator variables (+12 more)
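The `linearity of expectation` and `indicator variables` tags pair naturally in a classic example: the expected number of fixed points of a uniformly random permutation is 1 for every n, since each of the n indicator variables 1[perm(i) = i] has expectation 1/n. A small sketch (hypothetical function name) verifies this by exact enumeration for small n:

```python
from fractions import Fraction
from itertools import permutations

def expected_fixed_points(n):
    """Exact E[# fixed points] of a uniform random permutation of {0..n-1},
    computed by brute-force enumeration. Linearity of expectation predicts
    the answer is exactly 1 for every n, without any enumeration."""
    perms = list(permutations(range(n)))
    total = sum(sum(p[i] == i for i in range(n)) for p in perms)
    return Fraction(total, len(perms))
```

The point of the indicator-variable argument is that it sidesteps the (dependent!) joint distribution of fixed points entirely.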
βˆ‘ Math · Intermediate

Probability Fundamentals

Probability quantifies uncertainty by assigning numbers between 0 and 1 to events in a sample space.

#probability · #sample space · #conditional probability (+12 more)
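For a finite sample space, these numbers can be computed by counting, and the conditional probability from the tags is just a ratio: P(A | B) = P(A and B) / P(B). A minimal sketch, assuming two fair dice as the example space (hypothetical helper names):

```python
from fractions import Fraction
from itertools import product

# Hypothetical example: all 36 equally likely rolls of two fair dice.
space = list(product(range(1, 7), repeat=2))

def prob(event):
    """P(event) = favorable outcomes / total outcomes, for a uniform space."""
    hits = [w for w in space if event(w)]
    return Fraction(len(hits), len(space))

def cond_prob(a, b):
    """P(A | B) = P(A and B) / P(B); restricts the space to where B holds."""
    return prob(lambda w: a(w) and b(w)) / prob(b)
```

Conditioning can shift probabilities sharply: a total of 10 or more has probability 1/6 overall, but probability 1/2 once the first die shows a 6.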
βš™οΈAlgorithmIntermediate

Randomized Algorithms

Randomized algorithms use coin flips (random bits) to guide choices, often making code simpler and faster on average.

#randomized algorithms · #las vegas · #monte carlo (+12 more)
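A textbook instance of this idea is quicksort with a random pivot: the random choice removes any fixed worst-case input and gives expected O(n log n) time on every input, while the output is always correct (a Las Vegas algorithm). A minimal non-in-place sketch for clarity:

```python
import random

def randomized_quicksort(xs):
    """Quicksort with a uniformly random pivot.
    Always returns a correctly sorted list; only the running time is random.
    Recursing on strict partitions (with equal elements set aside) keeps
    duplicate-heavy inputs from degrading performance."""
    if len(xs) <= 1:
        return list(xs)
    pivot = random.choice(xs)
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)
```

Compare with deterministic quicksort on a fixed pivot (say, the first element), where an adversary can supply an already-sorted input and force quadratic time.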