Concentration inequalities give high-probability bounds that random outcomes stay close to their expectations, even without knowing the full distribution.
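As a minimal sketch of this idea, the snippet below empirically checks Hoeffding's inequality, P(|X̄ − μ| ≥ t) ≤ 2·exp(−2nt²), for the mean of n fair coin flips; the sample sizes and thresholds are illustrative choices, not from the source:

```python
import math
import random

random.seed(0)
n, t, trials = 100, 0.1, 10_000
mu = 0.5  # true mean of a fair coin flip

# Count how often the sample mean deviates from mu by at least t.
deviations = 0
for _ in range(trials):
    sample_mean = sum(random.random() < 0.5 for _ in range(n)) / n
    if abs(sample_mean - mu) >= t:
        deviations += 1

empirical = deviations / trials
bound = 2 * math.exp(-2 * n * t**2)  # Hoeffding's two-sided tail bound

print(f"empirical tail probability: {empirical:.4f}")
print(f"Hoeffding bound:            {bound:.4f}")
```

The bound holds without knowing anything about the flip distribution beyond its range, which is exactly the distribution-free guarantee the definition describes.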
Information-theoretic lower bounds put a floor on the error any learning algorithm must incur, regardless of cleverness or compute.
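One standard route to such bounds, stated here as a sketch rather than anything from the source, is Fano's inequality: if the truth is one of $M$ equally likely hypotheses $\theta_1,\dots,\theta_M$ and the data $X$ carries mutual information $I(\theta;X)$ about the truth, then every estimator $\hat\theta$ satisfies

$$
\inf_{\hat\theta}\ \Pr\bigl[\hat\theta(X) \neq \theta\bigr] \;\geq\; 1 - \frac{I(\theta;X) + \log 2}{\log M}.
$$

When the data simply do not contain enough information to distinguish the $M$ hypotheses (small $I(\theta;X)$ relative to $\log M$), the error probability stays bounded away from zero for every algorithm, however clever.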