Minimum Description Length (MDL) picks the model that compresses the data best by minimizing L(M) + L(D|M).
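A toy comparison can make the MDL trade-off concrete. The sketch below (an illustration, not a formal MDL coder) scores two hypothetical models of 100 coin flips with 60 heads: a parameter-free "fair coin" model, and a fitted-bias model that is charged an assumed 16 bits to state its parameter. L(D|M) is taken as the negative log2-likelihood of the data.

```python
import math

# Toy MDL comparison: 100 coin flips, 60 heads observed.
n_flips, n_heads = 100, 60

def data_code_length(p_heads):
    """L(D|M): bits to encode the observed flips under a model with P(heads) = p_heads."""
    return -(n_heads * math.log2(p_heads)
             + (n_flips - n_heads) * math.log2(1 - p_heads))

# Model A: "fair coin" — no free parameters, so take L(M) = 0 bits.
mdl_fair = 0 + data_code_length(0.5)          # = 100 bits exactly

# Model B: fitted bias p = 0.6, charged a (hypothetical) 16 bits to state the parameter.
mdl_biased = 16 + data_code_length(0.6)       # ≈ 16 + 97.1 = 113.1 bits

print(mdl_fair, mdl_biased)
```

Even though the biased model fits the data slightly better (97.1 vs 100 bits for L(D|M)), its complexity charge makes its total code length longer, so MDL prefers the fair coin here.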
Cross-entropy measures how well a proposed distribution Q predicts outcomes actually generated by a true distribution P.
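For discrete distributions this is H(P, Q) = -Σₓ P(x) log2 Q(x); a minimal sketch, with P and Q given as probability lists over the same outcomes:

```python
import math

def cross_entropy(p, q):
    """H(P, Q) = -sum_x P(x) * log2 Q(x), in bits."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

# True distribution: fair coin. Model Q: overconfident in outcome 0.
print(cross_entropy([0.5, 0.5], [0.9, 0.1]))   # ≈ 1.737 bits
```

The worse Q predicts outcomes drawn from P, the more bits it costs on average to encode them using a code built for Q.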
KL divergence measures how much information is lost when using model Q to approximate the true distribution P.
Mutual Information (MI) measures how much knowing one random variable reduces uncertainty about another.
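For discrete variables, I(X; Y) = Σ p(x, y) log2( p(x, y) / (p(x) p(y)) ). A minimal sketch, assuming the joint distribution is given as a 2-D list of probabilities:

```python
import math

def mutual_information(joint):
    """I(X; Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), in bits."""
    px = [sum(row) for row in joint]          # marginal over rows (X)
    py = [sum(col) for col in zip(*joint)]    # marginal over columns (Y)
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# Two perfectly correlated fair bits: knowing X fully determines Y.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0 bit

# Two independent fair bits: knowing X tells you nothing about Y.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))   # 0.0 bits
```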
Kullback–Leibler (KL) divergence measures how one probability distribution P diverges from a reference distribution Q, i.e. how differently P allocates probability mass compared with Q.
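For discrete distributions, D_KL(P ‖ Q) = Σₓ P(x) log2( P(x) / Q(x) ); a minimal sketch:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_x P(x) * log2( P(x) / Q(x) ), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Fair coin P approximated by a skewed model Q.
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))   # ≈ 0.737 bits
```

Note the identity H(P, Q) = H(P) + D_KL(P ‖ Q): KL divergence is exactly the extra coding cost, in bits, of using Q's code instead of P's.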
Information theory quantifies uncertainty and information using measures like entropy, cross-entropy, KL divergence, and mutual information.
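Entropy is the base quantity the other measures build on: H(P) = -Σₓ P(x) log2 P(x), the average number of bits needed to encode outcomes drawn from P. A minimal sketch:

```python
import math

def entropy(p):
    """Shannon entropy H(P) = -sum_x P(x) * log2 P(x), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit — a fair coin is maximally uncertain
print(entropy([1.0]))        # 0.0 bits — a certain outcome carries no information
```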