Concepts (7)
Information-Theoretic Lower Bounds
Information-theoretic lower bounds give the best performance any learning algorithm can possibly achieve on a problem, regardless of cleverness or compute; the classic tool is Fano's inequality, which converts the residual uncertainty H(X|Y) into a floor on prediction error.
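As a small numerical illustration, the weakened form of Fano's inequality, P_e >= (H(X|Y) - 1) / log2(|X| - 1), turns leftover label uncertainty (in bits) into a minimum error rate. The function name and numbers below are illustrative, not from any particular source:

```python
import math

def fano_error_lower_bound(h_x_given_y: float, num_classes: int) -> float:
    """Weakened Fano bound: P_e >= (H(X|Y) - 1) / log2(|X| - 1).

    Any classifier's error probability is at least this value, no matter
    how it is trained. Input entropy is in bits; the bound is clipped at 0.
    """
    if num_classes < 3:
        raise ValueError("this weakened form needs at least 3 classes")
    return max(0.0, (h_x_given_y - 1.0) / math.log2(num_classes - 1))

# Example: 10 classes, 2.5 bits of label uncertainty left after seeing Y.
print(fano_error_lower_bound(2.5, 10))  # ~0.47: no algorithm beats ~47% error
```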
Representation Learning Theory
Representation learning aims to automatically discover features that make downstream tasks easy, often without human-provided labels.
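A minimal sketch of the idea, assuming nothing beyond numpy and entirely synthetic data: an unsupervised step (here, the top principal component) discovers a one-dimensional feature without ever seeing labels, after which the downstream task reduces to a simple threshold:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two classes separated along one hidden direction,
# embedded in 20 noisy dimensions (all values are illustrative).
n, d = 200, 20
latent = np.concatenate([rng.normal(-2, 1, n // 2), rng.normal(2, 1, n // 2)])
labels = np.array([0] * (n // 2) + [1] * (n // 2))
direction = rng.normal(size=d)
direction /= np.linalg.norm(direction)
X = np.outer(latent, direction) + 0.5 * rng.normal(size=(n, d))

# Unsupervised step: top principal component, learned without labels.
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
z = Xc @ vt[0]  # 1-D learned representation

# The downstream task is now trivial: threshold the learned feature.
# (PC sign is arbitrary, so take the better of the two orientations.)
pred = (z > 0).astype(int)
acc = max((pred == labels).mean(), ((1 - pred) == labels).mean())
print(f"downstream accuracy from unsupervised feature: {acc:.2f}")
```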
Information Bottleneck Theory
Information Bottleneck (IB) studies how to compress an input X into a representation Z that still preserves what is needed to predict Y, formalized as choosing an encoder p(z|x) to minimize the Lagrangian I(X;Z) − β·I(Z;Y).
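A sketch of evaluating the IB Lagrangian for a fixed toy encoder p(z|x); the joint distribution, encoder, and β below are illustrative, not a fitted model:

```python
import numpy as np

def mutual_information(pxy: np.ndarray) -> float:
    """I(X;Y) in bits for a joint distribution given as a 2-D array."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

# Toy joint p(x, y) and a stochastic encoder p(z|x); values are illustrative.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
pz_given_x = np.array([[0.9, 0.1],   # row x=0 -> distribution over z
                       [0.2, 0.8]])  # row x=1

px = pxy.sum(axis=1)            # p(x)
pxz = px[:, None] * pz_given_x  # p(x, z) = p(x) p(z|x)
# Z depends on Y only through X (Markov chain Y - X - Z):
pzy = pz_given_x.T @ pxy        # p(z, y) = sum_x p(z|x) p(x, y)

i_xz = mutual_information(pxz)  # compression cost
i_zy = mutual_information(pzy)  # preserved predictive information
beta = 4.0
print(f"I(X;Z)={i_xz:.3f} bits, I(Z;Y)={i_zy:.3f} bits, "
      f"IB objective={i_xz - beta * i_zy:.3f}")
```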
Contrastive Learning Theory
Contrastive learning learns representations by pulling together positive pairs and pushing apart negatives, typically with a softmax-based objective such as InfoNCE.
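A minimal numpy sketch of the InfoNCE objective; the batch construction and values are illustrative (each anchor's positive is the matching row, and every other row in the batch serves as a negative):

```python
import numpy as np

def info_nce(anchors: np.ndarray, positives: np.ndarray,
             temperature: float = 0.1) -> float:
    """InfoNCE loss: softmax over cosine similarities, where each anchor
    must pick out its own positive against the rest of the batch."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # stabilize the softmax
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_softmax)))  # diagonal = positive pairs

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
aligned = info_nce(z, z + 0.01 * rng.normal(size=z.shape))  # near-identical views
unrelated = info_nce(z, rng.normal(size=(8, 16)))           # random "positives"
print(f"aligned pairs: {aligned:.3f}  vs  unrelated pairs: {unrelated:.3f}")
```

As expected, the loss is far lower when positives really are alternate views of the same anchor than when they are unrelated samples.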
Mutual Information
Mutual Information (MI) measures how much knowing one random variable reduces uncertainty about another.
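The "reduction in uncertainty" reading can be computed directly as I(X;Y) = H(X) − H(X|Y); the toy joint distribution below is illustrative:

```python
import numpy as np

def entropy_bits(p: np.ndarray) -> float:
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Toy joint p(x, y) over two binary variables (values illustrative).
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])

h_x = entropy_bits(pxy.sum(axis=1))   # H(X): uncertainty before seeing Y
h_y = entropy_bits(pxy.sum(axis=0))   # H(Y)
h_xy = entropy_bits(pxy.ravel())      # H(X, Y)
h_x_given_y = h_xy - h_y              # H(X|Y) = H(X,Y) - H(Y)
mi = h_x - h_x_given_y                # I(X;Y) = H(X) - H(X|Y)
print(f"H(X)={h_x:.3f} bits, H(X|Y)={h_x_given_y:.3f} bits, I(X;Y)={mi:.3f} bits")
```

Here knowing Y drops the uncertainty about X from 1 bit to about 0.72 bits, so X and Y share roughly 0.28 bits of information.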
Shannon Entropy
Shannon entropy quantifies the average uncertainty or information content of a random variable in bits when using base-2 logarithms.
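A direct transcription of the definition H(X) = −Σ p(x) log2 p(x); the example distributions are illustrative:

```python
import math

def shannon_entropy(probs) -> float:
    """H(X) = -sum p * log2(p), in bits; zero-probability outcomes contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.469 bits
print(shannon_entropy([0.25] * 4))   # uniform over 4 outcomes: 2.0 bits
```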
Information Theory
Information theory quantifies uncertainty and information using measures like entropy, cross-entropy, KL divergence, and mutual information.
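These measures are tightly linked; for instance, cross-entropy decomposes as H(p, q) = H(p) + KL(p || q). A small sketch with illustrative distributions:

```python
import math

def entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]  # "true" distribution (illustrative values)
q = [0.5, 0.3, 0.2]  # model distribution

# Identity tying the three together: H(p, q) = H(p) + KL(p || q)
print(f"{cross_entropy(p, q):.4f} == {entropy(p) + kl_divergence(p, q):.4f}")
```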