Concepts (4)
📚 Theory · Intermediate
Concentration Inequalities
Concentration inequalities bound the probability that a random quantity, typically a sum or average of independent variables, deviates far from its expectation, even without knowing the full distribution; one standard instance is sketched below.
#concentration inequalities · #hoeffding inequality · #chernoff bound · +12 more
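As a concrete instance, Hoeffding's inequality (one of the tags above) bounds how far the average of bounded i.i.d. variables can stray from its mean. A minimal sketch, with the notation X_i, a, b, n, t introduced here for illustration:

```latex
% Hoeffding's inequality: X_1, ..., X_n i.i.d. with a <= X_i <= b, and t > 0.
\Pr\!\left[\,\left|\frac{1}{n}\sum_{i=1}^{n} X_i - \mathbb{E}[X_1]\right| \ge t\,\right]
  \;\le\; 2\exp\!\left(-\frac{2 n t^2}{(b-a)^2}\right)
```

The bound decays exponentially in n, which is what makes high-probability statements cheap in sample size.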
📚 Theory · Advanced
Statistical Learning Theory
Statistical learning theory explains why a model that fits the training data can still predict well on unseen data, by bounding the true risk by the empirical risk plus a complexity term; one such bound is sketched below.
#statistical learning theory · #empirical risk minimization · #structural risk minimization · +11 more
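One illustrative form of the "empirical risk plus a complexity term" bound is the finite-class bound below; it is a sketch under the assumptions that the loss takes values in [0, 1], the sample of size n is i.i.d., and the hypothesis class H is finite (the symbols R, \widehat{R}_n, δ are notation introduced here):

```latex
% With probability at least 1 - \delta over the i.i.d. sample of size n,
% simultaneously for every hypothesis h in the finite class \mathcal{H}:
R(h) \;\le\; \widehat{R}_n(h) \;+\; \sqrt{\frac{\ln|\mathcal{H}| + \ln(2/\delta)}{2n}}
```

Here R(h) is the true risk and \widehat{R}_n(h) the empirical risk; richer classes pay a larger complexity term, which is the trade-off structural risk minimization exploits.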
📚 Theory · Intermediate
PAC Learning
PAC learning formalizes when a learner can probably (with probability at least 1−δ) and approximately (with error at most ε) succeed using a polynomial number of samples; a standard sample-complexity bound is sketched below.
#pac learning · #agnostic learning · #vc dimension · +12 more
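A standard sample-complexity statement, sketched for the realizable case with a finite hypothesis class H (ε and δ are the quantities in the description above; m is notation introduced here for the sample size):

```latex
% Realizable PAC bound: any learner that outputs a hypothesis consistent with
% m i.i.d. examples achieves error at most \varepsilon with probability at
% least 1 - \delta, provided
m \;\ge\; \frac{1}{\varepsilon}\left(\ln|\mathcal{H}| + \ln\frac{1}{\delta}\right)
```

For infinite classes the ln|H| term is replaced by a capacity measure such as the VC dimension, another of the tags above.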
📚 Theory · Advanced
Rademacher Complexity
Rademacher complexity is a data-dependent measure of how well a function class can correlate with random noise (random ±1 signs) on a given sample; its empirical form is written out below.
#rademacher complexity · #empirical rademacher · #generalization bounds · +12 more
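The data-dependent ("empirical") quantity named in the tags can be written down directly; this is a sketch with the notation S, F, σ_i introduced here:

```latex
% Empirical Rademacher complexity of a function class \mathcal{F} on a fixed
% sample S = (x_1, ..., x_n), where the \sigma_i are independent uniform
% random signs in {-1, +1}:
\widehat{\mathfrak{R}}_S(\mathcal{F})
  \;=\; \mathbb{E}_{\sigma}\!\left[\,\sup_{f \in \mathcal{F}}
        \frac{1}{n}\sum_{i=1}^{n} \sigma_i\, f(x_i)\right]
```

A large value means some function in the class can track arbitrary sign patterns on the sample, which is exactly the "fitting random noise" intuition in the description.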