Kullback–Leibler (KL) divergence measures how much information is lost when a model distribution Q is used to approximate the true distribution P.
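In its standard discrete form,

D_{\mathrm{KL}}(P \,\|\, Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)},

which is zero exactly when P = Q and grows as Q assigns low probability to outcomes that are likely under P. Note that it is not symmetric: D_KL(P || Q) generally differs from D_KL(Q || P).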
Rademacher complexity is a data-dependent measure of how well a function class can fit random noise on a given sample.
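Formally, the empirical Rademacher complexity of a function class F on a sample S = (x_1, \dots, x_n) is

\hat{\mathcal{R}}_S(F) = \mathbb{E}_{\sigma}\!\left[ \sup_{f \in F} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i) \right],

where the \sigma_i are independent Rademacher variables taking the values +1 and -1 with equal probability. Larger values mean the class can correlate more strongly with random labelings of the sample, which is why the quantity appears in generalization bounds.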